instruction | input | output
---|---|---
Please summarize the input | A context system that improves a V2X receiver's understanding of V2X communication | 1. A method for modifying an operation of a vehicle component based on Vehicle-to-Everything (V2X) communication, comprising: receiving, by a V2X receiver, a V2X message including context data that describes a context of a V2X transmitter in an environment, wherein the V2X receiver is not sufficiently able to identify the V2X transmitter as an originator of the V2X message within the environment; determining, based on the context data, digital data that describes an identity of the V2X transmitter as the originator of the V2X message within the environment; and modifying an operation of a vehicle component of the V2X receiver based on the digital data describing the identity of the V2X transmitter.
| 2. The method of claim 1, wherein the environment is a roadway environment including an ego vehicle and a remote vehicle that receives the V2X message, and the context data further describes a context of the ego vehicle in the roadway environment.
| 3. The method of claim 1, wherein the V2X message is not one of a Wi-Fi (registered trademark) message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE(R)) message, a millimeter-wave communication message, a Bluetooth(R) message, or a satellite communication.
| 4. The method of claim 1, wherein the V2X message is a Basic Safety Message (BSM).
| 5. The method of claim 1, wherein the context data describes a position of the V2X transmitter with an accuracy of substantially half the width of the roadway on which the V2X transmitter is traveling.
| 6. The method of claim 1, wherein the V2X transmitter is an ego vehicle.
| 7. The method of claim 1, wherein the V2X receiver is an autonomous vehicle.
| 8. A system, included in a V2X receiver, for modifying an operation of a vehicle component based on V2X communication, the system comprising: a processor; and a non-transitory memory communicatively coupled to the processor and storing computer code that, when executed by the processor, causes the processor to: receive a V2X message including context data that describes a context of a V2X transmitter in an environment, wherein the V2X receiver is not sufficiently able to identify the V2X transmitter as an originator of the V2X message within the environment; determine, based on the context data, digital data that describes an identity of the V2X transmitter as the originator of the V2X message within the environment; and modify an operation of a vehicle component of the V2X receiver based on the digital data describing the identity of the V2X transmitter.
| 9. The system of claim 8, wherein the V2X message is not one of a Wi-Fi (registered trademark) message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE(R)) message, a millimeter-wave communication message, a Bluetooth(R) message, or a satellite communication.
| 10. The system of claim 8, wherein the V2X message is a Basic Safety Message (BSM).
| 11. The system of claim 8, wherein the context data describes a position of the V2X transmitter with an accuracy of substantially half the width of the roadway on which the V2X transmitter is traveling.
| 12. The system of claim 8, wherein the V2X transmitter is an ego vehicle.
| 13. The system of claim 8, wherein the V2X receiver is an autonomous vehicle.
| 14. A computer program product operable to modify an operation of a vehicle component based on V2X communication, comprising instructions that, when executed by a processor of a V2X receiver, cause the processor to: receive a V2X message including context data that describes a context of a V2X transmitter in an environment, wherein the V2X receiver is not sufficiently able to identify the V2X transmitter as an originator of the V2X message within the environment; determine, based on the context data, digital data that describes an identity of the V2X transmitter as the originator of the V2X message within the environment; and modify an operation of a vehicle component of the V2X receiver based on the digital data describing the identity of the V2X transmitter.
| 15. The computer program product of claim 14, wherein the context data describes a position of the V2X transmitter with an accuracy of substantially half the width of the roadway on which the V2X transmitter is traveling. | The method involves receiving a V2X message including context data (191) that describes a context of a V2X transmitter in an environment. The V2X receiver is not sufficiently able to identify the V2X transmitter as an originator of the V2X message within the environment. Digital data describing an identity of the V2X transmitter as the originator of the V2X message within the environment is determined based on the context data. The operation of the vehicle component of the V2X receiver is modified based on the digital data describing the identity of the V2X transmitter. The V2X message is a Dedicated Short-Range Communication (DSRC) message. INDEPENDENT CLAIMS are included for the following: a system for modifying an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication; and a computer program product operable to modify an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication. Method for modifying an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication. Improves the ability to identify the transmitter of different DSRC messages, which significantly improves the performance of connected systems. The drawing shows a schematic block representation of an operating environment for a context system. 105 Network; 125 Processor; 146 Communication unit; 150 Sensor set; 191 Context data |
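The claims above turn on matching the context data carried in a V2X message against what the receiver itself observes, so that the receiver can tell which nearby vehicle actually sent the message. A minimal sketch of that matching step in Python; the field names, the scoring heuristic, and the 2 m threshold (roughly half a lane width, per claim 5) are illustrative assumptions, not taken from the patent:

```python
import math

def identify_transmitter(context, observed_vehicles, max_match_score=2.0):
    """Pick the locally observed vehicle that best matches the context data
    (position and speed) carried in a received V2X message.

    context: dict with 'x', 'y', 'speed' as reported by the transmitter.
    observed_vehicles: list of dicts with 'id', 'x', 'y', 'speed' taken from
    the receiver's own sensor set.
    Returns the matching vehicle id, or None if no candidate is close enough.
    """
    best_id, best_score = None, float("inf")
    for v in observed_vehicles:
        # Positional error dominates; speed disagreement breaks ties.
        pos_err = math.hypot(v["x"] - context["x"], v["y"] - context["y"])
        speed_err = abs(v["speed"] - context["speed"])
        score = pos_err + 0.5 * speed_err
        if score < best_score:
            best_id, best_score = v["id"], score
    # Position accuracy of about half a road width is enough to separate
    # vehicles in adjacent lanes, which is the level claim 5 calls for.
    return best_id if best_score <= max_match_score else None
```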
Please summarize the input | Vehicle-to-Everything data transfer for an automated vehicle | 1. A method comprising: providing, or attempting to provide, by a communication unit of an ego vehicle, digital data to a WiFi (R) router based on a mode of the communication unit; monitoring, by a machine learning algorithm of the ego vehicle, a network pattern that describes connectivity access of the communication unit; determining whether the connectivity access satisfies a connectivity threshold; and responsive to the connectivity access not satisfying the connectivity threshold, selecting operation in a transmission mode that includes providing, or attempting to provide, the digital data to a remote vehicle that has no bandwidth constraint preventing the digital data from being relayed to a server, wherein operation in the transmission mode includes: receiving connectivity data of one or more other endpoints including the remote vehicle; generating an access ranking of the ego vehicle based on the connectivity of the ego vehicle and the connectivity data of the one or more other endpoints; generating a local dynamic map that describes a first geographic position of the ego vehicle and one or more second geographic positions of the one or more other endpoints including the remote vehicle, the local dynamic map including the access ranking; and relaying the digital data to the remote vehicle for transmission to the server based on the local dynamic map including the access ranking.
| 2. The method of claim 1, wherein relaying the digital data to the remote vehicle is further based on a presence of a line of sight between the ego vehicle and the remote vehicle.
| 3. The method of claim 1 or 2, further comprising receiving modification data operable to modify a vehicle component of the ego vehicle, wherein the modification data is determined based on the digital data being successfully received by the server as a result of providing the digital data to the server, and the modification data causes a modification of the vehicle component.
| 4. The method of claim 3, wherein the ego vehicle is an autonomous vehicle and the vehicle component is an advanced driver assistance system (ADAS system) that is modified based on the modification data.
| 5. The method of claim 4, wherein the modification data modifies a real-time safety process of the autonomous vehicle provided by the ADAS system.
| 6. The method of claim 4, wherein the modification data modifies an operation of a braking system of the ego vehicle, and the operation of the braking system is controlled by the ADAS system.
| 7. The method of claim 1, wherein the WiFi router is for a home wireless network, and the connectivity access is based on a determination selected from a group including: (1) the WiFi router is not present; (2) the communication unit cannot communicate wirelessly with the WiFi router; (3) the WiFi router is outside a transmission range of the communication unit; (4) access to the wireless network of the WiFi router is insufficient; (5) the WiFi router cannot access the wireless network; or (6) the communication unit does not store a password for the WiFi router and is therefore not authenticated to communicate wirelessly with the WiFi router.
| 8. The method of claim 1, further comprising: acquiring ranking data describing one or more access rankings of one or more endpoints; identifying a partner endpoint based on the ranking data; and transmitting sensor data to the partner endpoint.
| 9. The method of claim 8, wherein the digital data includes sensor data recorded by a set including the ego vehicle and the remote vehicle.
| 10. The method of claim 1, wherein the ego vehicle is in a reception mode before switching to the transmission mode.
| 11. A system comprising an onboard computer system of an ego vehicle that includes a communication unit, a processor, and a non-transitory memory storing computer code, wherein the computer code, when executed by the processor, causes the processor to: provide, or attempt to provide, via the communication unit of the ego vehicle, digital data to a WiFi (R) router based on a mode of the communication unit; monitor, by a machine learning algorithm of the ego vehicle, a network pattern that describes connectivity access of the communication unit; determine whether the connectivity access satisfies a connectivity threshold; and responsive to the connectivity access not satisfying the connectivity threshold, select operation in a transmission mode that includes providing, or attempting to provide, the digital data to a remote vehicle that has no bandwidth constraint preventing the digital data from being relayed to a server, wherein operation in the transmission mode causes the processor to: receive connectivity data of one or more other endpoints including the remote vehicle; generate an access ranking of the ego vehicle based on the connectivity of the ego vehicle and the connectivity data of the one or more other endpoints; generate a local dynamic map that describes a first geographic position of the ego vehicle and one or more second geographic positions of the one or more other endpoints including the remote vehicle, the local dynamic map including the access ranking; and relay the digital data to the remote vehicle for transmission to the server based on the local dynamic map including the access ranking.
| 12. The system of claim 11, wherein the computer code, when executed by the processor, further causes the processor to receive modification data operable to modify a vehicle component of the ego vehicle, and the modification data is determined based on the digital data being successfully received by the server as a result of providing the digital data to the server, so that the modification data causes a modification of the vehicle component.
| 13. The system of claim 12, wherein the ego vehicle is an autonomous vehicle and the vehicle component is an advanced driver assistance system (ADAS system) that is modified based on the modification data.
| 14. The system of claim 13, wherein the ego vehicle is in a reception mode before switching to the transmission mode.
| 15. A computer program product comprising instructions that, when executed by a processor, cause the processor to: provide, or attempt to provide, via a communication unit of an ego vehicle, digital data to a WiFi (R) router based on a mode of the communication unit; monitor, by a machine learning algorithm of the ego vehicle, a network pattern that describes connectivity access of the communication unit; determine whether the connectivity access satisfies a connectivity threshold; and responsive to the connectivity access not satisfying the connectivity threshold, select operation in a transmission mode that includes providing, or attempting to provide, the digital data to a remote vehicle that has no bandwidth constraint preventing the digital data from being relayed to a server, wherein operation in the transmission mode causes the processor to: receive connectivity data of one or more other endpoints including the remote vehicle; generate an access ranking of the ego vehicle based on the connectivity of the ego vehicle and the connectivity data of the one or more other endpoints; generate a local dynamic map that describes a first geographic position of the ego vehicle and one or more second geographic positions of the one or more other endpoints including the remote vehicle, the local dynamic map including the access ranking; and relay the digital data to the remote vehicle for transmission to the server based on the local dynamic map including the access ranking. | The method involves providing or attempting to provide, by a communication unit of an ego vehicle (123), digital data to a communication device based on a mode of the communication unit, in which the digital data is relayed by the communication device to be received by a server, then determining, by a processor of the ego vehicle, feedback that describes a bandwidth constraint of the communication unit, and modifying, by the processor, the mode based on the feedback so that the mode is consistent with the bandwidth constraint and the digital data is successfully received by the server. INDEPENDENT CLAIMS are also included for the following: a computer program product having instructions for vehicle-to-everything data transfer for automated vehicles; and a system for vehicle-to-everything data transfer for automated vehicles. Method for vehicle-to-everything data transfer for automated vehicles. The analysis module analyzes sensor data that is generated by a particular automated vehicle and generates modification data for the particular automated vehicle based on the sensor data. The modification data may cause a modification of a vehicle component when received by the automated vehicle, and the analysis module can generate modification data that helps the advanced driver assistance system to improve safety and efficiency of the ego vehicle. The drawing shows a block diagram illustrating an operating environment for a feedback system. 123 Ego vehicle; 150 Connectivity; 152 Connectivity data set; 160 Digital data; 199a Feedback system |
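The core of the claims above is a fallback decision: upload over the ego vehicle's own WiFi access when it is good enough, otherwise pick a relay vehicle from the local dynamic map according to its access ranking. A small Python sketch of that decision; the threshold, range, and field names are assumptions for illustration, not values from the patent:

```python
import math
from dataclasses import dataclass

V2X_RANGE_M = 300.0            # assumed usable V2X relay range
CONNECTIVITY_THRESHOLD = 0.6   # assumed threshold for "good enough" WiFi access

@dataclass
class Endpoint:
    endpoint_id: str
    position: tuple            # (x, y) in meters, from the local dynamic map
    connectivity_score: float  # higher = better uplink to the server
    bandwidth_free: bool       # True if no bandwidth constraint blocks relaying

def choose_uplink(ego_position, ego_connectivity, endpoints):
    """Return 'wifi' when the ego vehicle's own connectivity access satisfies
    the threshold; otherwise return the best-ranked in-range relay endpoint."""
    if ego_connectivity >= CONNECTIVITY_THRESHOLD:
        return "wifi"
    reachable = [e for e in endpoints
                 if e.bandwidth_free
                 and math.dist(ego_position, e.position) <= V2X_RANGE_M]
    if not reachable:
        return None  # stay in reception mode and retry later
    # Access ranking: prefer the endpoint with the strongest server connectivity.
    return max(reachable, key=lambda e: e.connectivity_score)

relay = choose_uplink((0.0, 0.0), 0.3,
                      [Endpoint("RV-1", (120.0, 5.0), 0.9, True),
                       Endpoint("RV-2", (400.0, 0.0), 0.95, True)])
print(relay.endpoint_id if isinstance(relay, Endpoint) else relay)  # -> RV-1
```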
Please summarize the input | Vehicle component modification based on vehicular accident reconstruction data. The invention relates to vehicle component modification based on vehicular accident reconstruction (VAR) data and claims embodiments that modify a vehicle control system of a connected vehicle based on a VAR analysis. In some embodiments, a method comprises: detecting an occurrence of a collision associated with a remote vehicle; generating report data associated with the collision of the remote vehicle, wherein the report data comprises one or more of V2X data describing Vehicle-to-Everything (V2X) wireless messages received before the occurrence of the collision and event data describing one or more driving events observed by the ego vehicle; transmitting the report data to a server; receiving modification data from the server, the modification data describing a modification for a vehicle control system of the ego vehicle, wherein the modification data is generated through a VAR analysis based on the report data; and modifying an operation of the vehicle control system based on the modification data so as to improve safety of the ego vehicle. | 1. A method for an ego vehicle, the method comprising: generating report data associated with a collision of a remote vehicle, wherein the report data includes V2X data describing a first set of V2X wireless messages received from the remote vehicle prior to an occurrence of the collision and event data describing one or more driving events associated with the collision of the remote vehicle observed by one or more sensors of the ego vehicle, the ego vehicle being proximate to the collision; receiving modification data that describes a modification for a vehicle control system of the ego vehicle, wherein the modification data is generated through a vehicular accident reconstruction analysis based on the report data; and modifying an operation of the vehicle control system based on the modification data.
| 2. The method according to claim 1, wherein the vehicle control system includes one of an advanced driver assistance system (ADAS system) and an autonomous driving system.
| 3. The method according to claim 2, wherein the ego vehicle is an autonomous vehicle, and modifying the operation of the vehicle control system based on the modification data comprises: modifying a safety process of the autonomous vehicle provided by the autonomous driving system based on the modification data so as to improve safety of the autonomous vehicle.
| 4. The method according to claim 2, wherein modifying the operation of the vehicle control system based on the modification data includes: modifying one or more operations of one or more of a braking system, a steering system, and an acceleration system of the ego vehicle based on the modification data, wherein the one or more operations are controlled by the ADAS system.
| 5. The method according to claim 1, further comprising: identifying, based on a first V2X wireless message received from the remote vehicle, a presence of a pre-collision event associated with the remote vehicle; and in response to identifying the presence of the pre-collision event: recording sensor data from one or more sensors of the ego vehicle; generating, based on the sensor data, the event data describing the one or more driving events; monitoring the remote vehicle, prior to detecting the occurrence of the collision, so as to continue receiving one or more second V2X wireless messages from the remote vehicle, wherein the first set of V2X wireless messages contained in the report data comprises the first V2X wireless message and the one or more second V2X wireless messages; and detecting the occurrence of the collision associated with the remote vehicle.
| 6. The method according to claim 5, wherein identifying the presence of the pre-collision event associated with the remote vehicle comprises: receiving a set of initial V2X wireless messages from a set of remote vehicles through a network, wherein the set of initial V2X wireless messages comprises the first V2X wireless message from the remote vehicle; and analyzing the set of initial V2X wireless messages to identify that the first V2X wireless message comprises data indicating the pre-collision event associated with the remote vehicle.
| 7. The method according to claim 5, wherein detecting the occurrence of the collision associated with the remote vehicle comprises: detecting that the remote vehicle is involved in the collision based on the event data and one or more of the one or more second V2X wireless messages.
| 8. The method according to claim 5, wherein the event data comprises remote vehicle behavior data describing one or more actions of the remote vehicle observed by the ego vehicle, and generating the event data describing the one or more driving events based on the sensor data further comprises: generating, based on the sensor data, the remote vehicle behavior data describing the one or more actions of the remote vehicle.
| 9. The method according to claim 8, wherein the one or more actions of the remote vehicle comprise one or more pre-collision behaviors of the remote vehicle observed by the ego vehicle.
| 10. The method according to claim 5, wherein, prior to detecting the occurrence of the collision, the method further comprises: analyzing the sensor data to determine an endpoint that is a counterpart involved in the pre-collision event with the remote vehicle; and monitoring the endpoint to receive one or more third V2X wireless messages from the endpoint; and wherein generating the event data describing the one or more driving events based on the sensor data further comprises: generating, based on the sensor data, endpoint behavior data describing one or more actions of the endpoint observed by the ego vehicle.
| 11. The method according to claim 10, further comprising: determining, based on one or more of the one or more third V2X wireless messages and the endpoint behavior data, that the endpoint is involved in the collision with the remote vehicle, wherein the first set of V2X wireless messages contained in the report data further comprises the one or more third V2X wireless messages from the endpoint, and wherein the event data contained in the report data further comprises the endpoint behavior data.
| 12. The method according to claim 10, wherein the one or more actions of the endpoint include one or more pre-collision behaviors of the endpoint observed by the ego vehicle.
| 13. The method according to claim 1, wherein the event data contained in the report data further comprises one or more pre-collision conditions observed by the ego vehicle.
| 14. The method according to claim 1, wherein each of the V2X wireless messages in the first set of V2X wireless messages is selected from the group consisting of: a Dedicated Short-Range Communication (DSRC) message; a Basic Safety Message; a Long Term Evolution (LTE) message; an LTE-V2X message; a 5G-V2X message; and a millimeter-wave message.
| 15. A system for an ego vehicle, comprising: an onboard computer system of the ego vehicle including a communication unit, a processor, and a non-transitory memory storing computer code that, when executed by the processor, causes the processor to: generate report data associated with a collision of a remote vehicle, wherein the report data comprises V2X data describing a first set of V2X wireless messages received from the remote vehicle prior to detecting an occurrence of the collision and event data describing one or more driving events associated with the collision of the remote vehicle observed by one or more sensors of the ego vehicle, the ego vehicle being proximate to the collision; receive modification data that describes a modification for a vehicle control system of the ego vehicle, wherein the modification data is generated through a vehicular accident reconstruction analysis based on the report data; and modify an operation of the vehicle control system based on the modification data.
| 16. The system according to claim 15, wherein the vehicle control system includes one of an advanced driver assistance system (ADAS system) and an autonomous driving system.
| 17. The system according to claim 16, wherein the ego vehicle is an autonomous vehicle, and the computer code, when executed by the processor, causes the processor to modify the operation of the vehicle control system based on the modification data at least by: modifying a safety process of the autonomous vehicle provided by the autonomous driving system based on the modification data so as to improve safety of the autonomous vehicle.
| 18. The system according to claim 16, wherein the computer code, when executed by the processor, causes the processor to modify the operation of the vehicle control system based on the modification data at least by: modifying one or more operations of one or more of a braking system, a steering system, and an acceleration system of the ego vehicle based on the modification data, wherein the one or more operations are controlled by the ADAS system.
| 19. A computer program product comprising instructions that, when executed by a processor, cause the processor to perform operations comprising: generating report data associated with a collision of a remote vehicle, wherein the report data comprises V2X data describing a first set of V2X wireless messages received from the remote vehicle prior to detecting an occurrence of the collision and event data describing one or more driving events associated with the collision of the remote vehicle observed by one or more sensors of an ego vehicle that is proximate to the collision; receiving modification data that describes a modification for a vehicle control system of the ego vehicle, wherein the modification data is generated through a vehicular accident reconstruction analysis based on the report data; and modifying an operation of the vehicle control system based on the modification data.
| 20. The computer program product according to claim 19, wherein the vehicle control system comprises an autonomous driving system, the ego vehicle is an autonomous vehicle, and the instructions, when executed by the processor, cause the processor to modify the operation of the vehicle control system based on the modification data at least by: modifying a safety process of the autonomous vehicle provided by the autonomous driving system based on the modification data so as to improve safety of the autonomous vehicle. | The method (300) involves detecting (301) an occurrence of a collision associated with a remote vehicle. Report data associated with the collision of the remote vehicle is generated (303). The report data includes vehicle-to-everything (V2X) data that describes a set of V2X wireless messages received prior to detecting the occurrence of the collision and event data that describes driving events that are observed by the ego vehicle. The report data is transmitted (305) to a server. Modification data that describes a modification for a vehicle control system of the ego vehicle is received (307) from the server. The modification data is generated based on the report data through a vehicular accident reconstruction analysis. An operation of the vehicle control system is modified (309) based on the modification data to improve safety of the ego vehicle. INDEPENDENT CLAIMS are included for the following: a system for modifying a vehicle control system of a connected vehicle; and a computer program product for modifying a vehicle control system of a connected vehicle. Method for modifying a vehicle control system of a connected vehicle based on vehicular accident reconstruction (VAR) analysis. The feedback system and the analysis system cooperate with one another to use V2X data and sensor data recorded by the ego vehicle, which is proximate to a crash, to provide an easier and more accurate VAR analysis for the crash. The system where the ego vehicle is an autonomous vehicle and the computer code, when executed by the processor, causes the processor to modify the operation of the vehicle control system based on the modification data by modifying a safety process of the autonomous vehicle which is provided by the autonomous driving system based on the modification data to increase safety of the autonomous vehicle. The VAR data is analyzed to generate either design data or patch data that describes a modification for an advanced driver assistance system (ADAS) or an autonomous driving system that would have resulted in the collision being avoided or made the collision less likely to occur. The drawing shows a flowchart illustrating the method for modifying a vehicle control system of a connected vehicle based on the result of a VAR analysis. 300 Method for modifying a vehicle control system of a connected vehicle based on a result of a VAR analysis; 301 Step for detecting an occurrence of a collision associated with a remote vehicle; 303 Step for generating report data associated with the collision of the remote vehicle; 305 Step for transmitting the report data to a server; 307 Step for receiving modification data that describes a modification for a vehicle control system of the ego vehicle from the server; 309 Step for modifying an operation of the vehicle control system based on the modification data to improve safety of the ego vehicle |
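The claims describe an ego vehicle assembling report data (pre-collision V2X messages plus locally observed driving events) once a nearby crash is noticed. A minimal Python sketch of that assembly step; the class and field names are illustrative placeholders rather than structures defined by the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CrashReport:
    """Report data assembled by an ego vehicle that witnesses a nearby crash."""
    remote_vehicle_id: str
    v2x_messages: List[dict] = field(default_factory=list)   # received before the collision
    driving_events: List[str] = field(default_factory=list)  # observed by the ego sensors

def build_report(remote_vehicle_id, received_messages, sensor_events):
    """Keep only the V2X messages that came from the crashed remote vehicle and
    pair them with the driving events the ego vehicle's sensors recorded."""
    report = CrashReport(remote_vehicle_id)
    report.v2x_messages = [m for m in received_messages
                           if m.get("sender") == remote_vehicle_id]
    report.driving_events = list(sensor_events)
    return report

report = build_report("RV-7",
                      [{"sender": "RV-7", "speed": 31.0}, {"sender": "RV-9", "speed": 12.0}],
                      ["hard_braking_observed", "lane_departure_observed"])
print(len(report.v2x_messages), report.driving_events)
```

The report would then be uploaded for the VAR analysis, and any modification data returned by the server applied to the ego vehicle's ADAS or autonomous driving system.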
Please summarize the input | Modification of a vehicle component based on Vehicle-to-Everything communication | 1. A method performed by an ego vehicle, the method comprising: generating received signal strength (RSS) data describing an RSS value for a first Vehicle-to-Everything (V2X) message originated by a remote vehicle; generating RSS data for a second V2X message transmitted by the remote vehicle; determining, based on the respective RSS data, a first range from the ego vehicle to the remote vehicle that changes over time; extracting a set of sensor data that includes a second range that changes over time, wherein the first V2X message and the second V2X message are associated with the remote vehicle; determining that the second range is inaccurate by comparing the first range over time to the second range over time; determining, based on a difference between the first range over time and the second range over time, that the V2X messages were transmitted by a computer that the ego vehicle has determined to be transmitting inaccurate sensor data; and modifying an operation of an advanced driver assistance system (ADAS) or an autonomous driving system of the ego vehicle so that the ADAS or the autonomous driving system does not consider the sensor data provided by the remote vehicle.
| 2. The method of claim 1, wherein the V2X message is a Dedicated Short-Range Communication (DSRC) message.
| 3. The method of claim 1 or 2, wherein the V2X message is not one of a WiFi message, a 3G message, a 4G message, a 5G message, a Long-Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, or a satellite communication.
| 4. The method of any one of claims 1 to 3, wherein the V2X message is received by a V2X radio.
| 5. The method of claim 4, wherein the V2X radio is not an element of the ego vehicle.
| 6. The method of claim 4, wherein the V2X radio includes a plurality of channels including a reserved channel that is reserved for receiving the V2X message.
| 7. The method of claim 6, wherein the reserved channel is reserved for Basic Safety Messages.
| 8. A system comprising: a processor communicatively coupled to a Vehicle-to-Everything (V2X) radio and a non-transitory memory, wherein the V2X radio is operable to receive V2X messages, and the non-transitory memory stores computer code that, when executed by the processor, causes the processor to: generate received signal strength (RSS) data describing an RSS value for a first V2X message originated by a remote vehicle; generate RSS data for a second V2X message transmitted by the remote vehicle; determine, based on the respective RSS data, a first range from an ego vehicle to the remote vehicle that changes over time; extract a set of sensor data that includes a second range that changes over time, wherein the first V2X message and the second V2X message are associated with the remote vehicle; determine that the second range is inaccurate by comparing the first range over time to the second range over time; determine, based on a difference between the first range over time and the second range over time, that the V2X messages were transmitted by a computer that the ego vehicle has determined to be transmitting inaccurate sensor data; and modify an operation of an advanced driver assistance system (ADAS) or an autonomous driving system of the ego vehicle so that the ADAS or the autonomous driving system does not consider the sensor data provided by the remote vehicle.
| 9. The system of claim 8, wherein the V2X message is a Dedicated Short-Range Communication (DSRC) message.
| 10. A program that, when executed by a processor, causes the processor to perform operations comprising: generating received signal strength (RSS) data describing an RSS value for a first Vehicle-to-Everything (V2X) message originated by a remote vehicle; generating RSS data for a second V2X message transmitted by the remote vehicle; determining, based on the respective RSS data, a first range from an ego vehicle to the remote vehicle that changes over time; extracting a set of sensor data that includes a second range that changes over time, wherein the first V2X message and the second V2X message are associated with the remote vehicle; determining that the second range is inaccurate by comparing the first range over time to the second range over time; determining, based on a difference between the first range over time and the second range over time, that the V2X messages were transmitted by a computer that the ego vehicle has determined to be transmitting inaccurate sensor data; and modifying an operation of an advanced driver assistance system (ADAS) or an autonomous driving system of the ego vehicle so that the ADAS or the autonomous driving system does not consider the sensor data provided by the remote vehicle.
| 11. The program of claim 10, wherein the operation of the ADAS or the autonomous driving system is modified so that the ADAS or the autonomous driving system does not presently consider the sensor data.
| 12. The program of claim 10 or 11, wherein the operation of the ADAS or the autonomous driving system is modified so that the ADAS or the autonomous driving system does not consider future sensor data received from the remote vehicle.
| 13. The program of any one of claims 10 to 12, wherein the operation of the ADAS or the autonomous driving system is modified so that the ADAS or the autonomous driving system does not consider past sensor data received from the remote vehicle. | The method involves generating, by an ego vehicle (123), received signal strength (RSS) data describing an RSS value for a vehicle-to-everything (V2X) message originated by a remote vehicle (124). The range data corresponding to the RSS value is determined, which describes a first range from the ego vehicle to the remote vehicle. The ego vehicle determines that the remote vehicle is providing inaccurate sensor data by comparing the first range to a second range which is described by the sensor data extracted from the V2X message. An operation of a vehicle component of the ego vehicle is modified so that the vehicle component does not consider the sensor data that is provided by the remote vehicle. Method executed by an ego vehicle for use in vehicle-to-everything communications. The method modifies the operation of the ego vehicle's autonomous driving system so that the inaccurate sensor data transmitted by a misbehaving endpoint is ignored in the future by the ego vehicle and does not create a safety hazard. The method identifies misbehaving endpoints that repeatedly transmit inaccurate sensor data and takes steps to reduce any safety hazards that would otherwise be created for ego vehicles operating their autonomous driving systems. The drawing shows the block diagram of an operating environment for a modification system. 104 Computer; 105 Network; 123 Ego vehicle; 124 Remote vehicle; 199 Modification system |
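The check in these claims is a consistency test between two independent range estimates: one inferred from the received signal strength of the remote vehicle's V2X messages and one taken from the sensor data inside those messages. A small Python sketch of that test, assuming a log-distance path-loss model and illustrative constants (the patent does not specify a propagation model or thresholds):

```python
import math

# Assumed log-distance path-loss model: RSS(d) = RSS(d0) - 10 * n * log10(d / d0), d0 = 1 m.
REF_RSS_DBM = -40.0        # assumed RSS at the 1 m reference distance
PATH_LOSS_EXPONENT = 2.2   # assumed environment-dependent exponent

def range_from_rss(rss_dbm):
    """Invert the path-loss model to estimate the transmitter range in meters."""
    return 10 ** ((REF_RSS_DBM - rss_dbm) / (10 * PATH_LOSS_EXPONENT))

def is_sensor_data_inaccurate(rss_samples_dbm, reported_ranges_m, tolerance_m=10.0):
    """Compare the RSS-derived range over time with the range reported in the
    remote vehicle's payload; flag the sender if the two persistently disagree."""
    errors = [abs(range_from_rss(rss) - reported)
              for rss, reported in zip(rss_samples_dbm, reported_ranges_m)]
    return sum(errors) / len(errors) > tolerance_m

# Example: a sender claiming to be ~5 m away while its RSS implies roughly 60 m.
if is_sensor_data_inaccurate([-79.5, -79.0], [5.0, 5.2]):
    print("ignore sensor data from this remote vehicle")
```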
Please summarize the input | Detection of a drowsy driver based on Vehicle-to-Everything (V2X) communication | 1. A method comprising: determining, by a second connected vehicle, that a second driver of the second connected vehicle is drowsy; responsive to the second driver being drowsy and the second connected vehicle not being operated autonomously, transmitting, by the second connected vehicle, a V2X message notifying of the drowsiness of the second driver; receiving, by a first connected vehicle, the V2X message transmitted from the second connected vehicle; determining whether the first connected vehicle is an autonomous vehicle; when the first connected vehicle is not an autonomous vehicle, providing, by the first connected vehicle, a notification to a first driver of the first connected vehicle responsive to receiving the V2X message notifying of the drowsiness of the second driver; and when the first connected vehicle is an autonomous vehicle, automatically executing, by the first connected vehicle, an avoidance maneuver that avoids the second connected vehicle responsive to receiving the V2X message notifying of the drowsiness of the second driver, so that a risk created by the second driver is reduced.
| 2. The method of claim 1, wherein the V2X message is a Dedicated Short-Range Communication (DSRC) message.
| 3. The method of claim 1, wherein the V2X message is one of a WiFi message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, and a satellite communication.
| 4. The method of claim 1, wherein the V2X message is a Basic Safety Message (BSM).
| 5. The method of claim 1, wherein the notification is at least one of a visual notification displayed on a head unit and an audio notification.
| 6. A system comprising: a first processor of a first connected vehicle; a first non-transitory memory communicatively coupled to the first processor and storing first computer code; a second processor of a second connected vehicle; and a second non-transitory memory communicatively coupled to the second processor and storing second computer code; wherein the second computer code, when executed by the second processor, causes the second processor to: determine that a second driver of the second connected vehicle is drowsy; and responsive to the second driver being drowsy and the second connected vehicle not being operated autonomously, transmit a V2X message notifying of the drowsiness of the second driver; and wherein the first computer code, when executed by the first processor, causes the first processor to: receive the V2X message transmitted from the second connected vehicle; determine whether the first connected vehicle is an autonomous vehicle; when the first connected vehicle is not an autonomous vehicle, provide a notification to a first driver of the first connected vehicle responsive to receiving the V2X message notifying of the drowsiness of the second driver; and when the first connected vehicle is an autonomous vehicle, automatically execute an avoidance maneuver that avoids the second connected vehicle responsive to receiving the V2X message notifying of the drowsiness of the second driver, so that a risk created by the second driver is reduced.
| 7. The system of claim 6, wherein the notification is at least one of a visual notification displayed on a head unit and an audio notification. | The method involves receiving a Vehicle-to-Everything (V2X) message including digital data describing a path history of a first connected vehicle by a second connected vehicle. Determination is made that a driver of the first connected vehicle is drowsy based on the path history described by the digital data included in the V2X message by the second connected vehicle. A remedial action is executed to modify an operation of the second connected vehicle based on the driver of the first connected vehicle being drowsy by the second connected vehicle such that risk created by the driver is reduced, where the V2X message is selected from a group consisting of a Basic Safety Message, Dedicated Short-Range Communication (DSRC) message, Wi-Fi message, 3G message, 4G message, 5G message, Long-Term Evolution (LTE) message, mm wave communication message, Bluetooth message and a satellite communication. INDEPENDENT CLAIMS are also included for the following: a system for detecting the presence of a drowsy driver of a vehicle based on V2X communications; and a computer program product comprising a set of instructions for detecting the presence of a drowsy driver of a vehicle based on V2X communications. Method for detecting the presence of a drowsy driver of a vehicle based on V2X communications. The method enables a drowsiness detection system to provide a safer driving environment and improve operation of a connected vehicle by assisting the connected vehicle to avoid drowsy drivers and reduce risk caused by drowsy drivers. The drawing shows a flow diagram illustrating a method for modifying operation of a connected vehicle to reduce risk caused by a drowsy driver. 301 Step for transmitting a Basic Safety Message; 303 Step for receiving the Basic Safety Message; 305 Step for parsing out Basic Safety Message data from the Basic Safety Message; 307 Step for analyzing path history data included in the Basic Safety Message data; 308 Step for analyzing path history data |
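The receive-side logic in the claims is a two-way branch on whether the receiving vehicle drives itself: an autonomous vehicle plans an avoidance maneuver, a human-driven vehicle warns its driver. A self-contained Python sketch of that branch; the action names are illustrative placeholders, not terms from the patent:

```python
def handle_drowsy_driver_message(is_autonomous, drowsy_vehicle_id):
    """Return the remedial action for a received 'drowsy driver' V2X message."""
    if is_autonomous:
        # Autonomous receiver: automatically execute an avoidance maneuver.
        return {"action": "avoidance_maneuver", "target": drowsy_vehicle_id}
    # Human-driven receiver: notify the driver on the head unit and/or by audio.
    return {"action": "notify_driver",
            "message": f"Drowsy driver nearby: vehicle {drowsy_vehicle_id}"}

print(handle_drowsy_driver_message(True, "RV-42"))   # avoidance maneuver
print(handle_drowsy_driver_message(False, "RV-42"))  # driver notification
```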
Please summarize the input | Collision avoidance for a connected vehicle based on a digital behavioral twin | 1. A collision avoidance method for a connected vehicle based on a digital behavioral twin, the method comprising: recording, by an ego vehicle, digital data describing a driving context and driving behavior of a remote vehicle and the ego vehicle; determining, by the ego vehicle, a risk of a collision involving one or more of the remote vehicle and the ego vehicle based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle, and the digital data; and modifying an operation of the ego vehicle based on the risk; wherein the first digital behavioral twin is a model describing driving behavior of a remote driver of the remote vehicle in one or more different driving contexts, and the second digital behavioral twin is a model describing driving behavior of an ego driver of the ego vehicle in one or more different driving contexts.
| 2. The collision avoidance method of claim 1, wherein modifying the operation of the ego vehicle includes displaying, by an electronic display of the ego vehicle, a graphical output that visually depicts the risk.
| 3. The collision avoidance method of claim 2, wherein the graphical output is an augmented reality (AR) visualization showing a risk of collision for different portions of a roadway on which the ego vehicle is currently traveling or is about to travel.
| 4. The collision avoidance method of claim 1, further comprising transmitting, by the remote vehicle, a Vehicle-to-Everything (V2X) message to the ego vehicle, wherein the V2X message includes remote twin data describing the first digital behavioral twin.
| 5. The collision avoidance method of claim 1, wherein the one or more different driving contexts are based on a pattern of the remote driver accelerating or decelerating in response to a change of a traffic signal.
| 6. The collision avoidance method of claim 1, wherein the ego vehicle is an autonomous vehicle.
| 7. The collision avoidance method of claim 6, wherein modifying the operation of the ego vehicle includes the ego vehicle autonomously modifying its operation to mitigate the risk.
| 8. The collision avoidance method of claim 1, further comprising modifying, by the ego vehicle, the first digital behavioral twin based on new digital data describing a different driving behavior of the remote vehicle in the driving context, wherein the first digital behavioral twin is modified to include the different driving behavior.
| 9. A collision avoidance system for a connected vehicle based on a digital behavioral twin, the system comprising: a non-transitory memory storing digital data describing a driving context and driving behavior of a remote vehicle and an ego vehicle; and a processor communicatively coupled to the non-transitory memory, wherein the non-transitory memory stores computer code that, when executed by the processor, causes the processor to: determine a risk of a collision involving one or more of the remote vehicle and the ego vehicle based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle, and the digital data; and modify an operation of the ego vehicle based on the risk; wherein the first digital behavioral twin is a model describing driving behavior of a remote driver of the remote vehicle in one or more different driving contexts, and the second digital behavioral twin is a model describing driving behavior of an ego driver of the ego vehicle in one or more different driving contexts.
| 10. The collision avoidance system of claim 9, wherein modifying the operation of the ego vehicle includes displaying, by an electronic display of the ego vehicle, a graphical output that visually depicts the risk.
| 11. The collision avoidance system of claim 10, wherein the graphical output is an augmented reality (AR) visualization showing a risk of collision for different portions of a roadway on which the ego vehicle is currently traveling or is about to travel using correspondingly colored regions.
| 12. The collision avoidance system of claim 9, wherein the first digital behavioral twin is described by remote twin data received by the ego vehicle via a Vehicle-to-Everything (V2X) message transmitted by the remote vehicle.
| 13. The collision avoidance system of claim 9, wherein the non-transitory memory stores additional computer code that, when executed by the processor, causes the processor to modify the first digital behavioral twin based on new digital data describing a different driving behavior of the remote vehicle in the driving context, wherein the first digital behavioral twin is modified to include the different driving behavior.
| 14. A program that, when executed by a processor, causes the processor to perform operations comprising: recording digital data describing a driving context and driving behavior of a remote vehicle and an ego vehicle in the driving context; determining a risk of a collision involving one or more of the remote vehicle and the ego vehicle based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle, and the digital data; and modifying an operation of the ego vehicle based on the risk; wherein the first digital behavioral twin is a model describing driving behavior of a remote driver of the remote vehicle in one or more different driving contexts, and the second digital behavioral twin is a model describing driving behavior of an ego driver of the ego vehicle in one or more different driving contexts. | The method involves recording digital data describing a driving context and a driving behavior of a remote vehicle and an ego vehicle in this driving context. A risk of a collision involving the remote vehicle and the ego vehicle is determined based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle, and the digital data. An operation of the ego vehicle is modified based on the risk. INDEPENDENT CLAIMS are included for the following: a system for providing a digital twin service for a real-world vehicle; and a computer program product providing a digital twin service for a real-world vehicle. Method for providing a digital twin service for a real-world vehicle. The twin client operates quickly and gives ego drivers more time to respond to dangerous situations, which increases driver safety for the ego driver and other drivers on the roadway. The digital behavioral twin system warns the ego driver before dangerous actions of other remote drivers are taken, which gives the ego driver more time to avoid a collision. The augmented reality (AR) visualization provided by the twin client reduces mental fatigue of the ego driver by visualizing risks using more intuitive safe/unsafe regions, which are mentally processed by the ego driver using their subconscious. The driver must enable the automated system only when it is safe to do so. The driver does not need to adjust eye focus in order to view the projected image when an image is projected at the same three-dimensional position, resulting in easy grasp of the projected image while looking at the real object. The drawing shows a block diagram illustrating an operating environment for a digital behavioral twin system and a twin client. 100 Operating environment; 105 Network; 107 Digital twin server; 127A, 127B Memory; 199 Digital behavioral twin system |
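The claims combine two per-driver behavior models with the current driving context to score collision risk before choosing a response. A toy Python sketch of that combination; the twins here are reduced to lookup tables, and the risk rule, context names, and thresholds are illustrative assumptions only:

```python
# Each behavioral twin is reduced to expected behavior per driving context;
# real twins would be learned models of a specific driver.
remote_twin = {"yellow_light": "accelerates", "merge": "yields"}
ego_twin = {"yellow_light": "brakes_hard", "merge": "accelerates"}

RISKY_PAIRS = {("accelerates", "brakes_hard"), ("accelerates", "accelerates")}

def collision_risk(context, gap_m):
    """Combine the two twins' predicted behaviors in the current context with
    the measured gap to produce a coarse risk estimate in [0, 1]."""
    pair = (remote_twin.get(context, "unknown"), ego_twin.get(context, "unknown"))
    behavior_risk = 0.8 if pair in RISKY_PAIRS else 0.2
    proximity_risk = max(0.0, min(1.0, (30.0 - gap_m) / 30.0))  # closer = riskier
    return behavior_risk * proximity_risk

risk = collision_risk("yellow_light", gap_m=8.0)
if risk > 0.5:
    print("warn the ego driver / adjust the planned maneuver")  # modify operation
```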
Please summarize the input | A traveling control system for autonomous vehicles, a server device, and an autonomous vehicle. PROBLEM TO BE SOLVED: To allow multiple autonomous traveling vehicles to travel efficiently when they travel together on the same lane.
SOLUTION: A traveling control system for autonomous vehicles, which controls multiple autonomous traveling vehicles that travel autonomously in accordance with a predetermined operation command, includes: priority setting means for setting priorities among the multiple autonomous traveling vehicles; and travel control means for controlling traveling of the multiple autonomous traveling vehicles so that, when the multiple autonomous traveling vehicles travel together on the same lane, traveling of an autonomous traveling vehicle with a higher priority set by the priority setting means is prioritized over traveling of an autonomous traveling vehicle with a lower priority set by the priority setting means.
SELECTED DRAWING: Figure 6 | 1. A traveling control system for autonomous vehicles that controls a plurality of autonomous vehicles traveling autonomously in accordance with a predetermined operation command, the traveling control system comprising: priority setting means for setting priorities among the plurality of autonomous vehicles; and travel control means for controlling traveling of the plurality of autonomous vehicles such that, when the plurality of autonomous vehicles travel together on the same lane, traveling of an autonomous vehicle for which the priority setting means has set a higher priority is prioritized over traveling of an autonomous vehicle for which the priority setting means has set a lower priority.
| 2. The traveling control system for autonomous vehicles according to claim 1, wherein the priority setting means sets a higher priority for an autonomous vehicle that charges a user a higher fee than for an autonomous vehicle that charges a user a lower fee.
| 3. The traveling control system for autonomous vehicles according to claim 1, wherein the priority setting means sets a higher priority for an autonomous vehicle with a higher traveling speed than for an autonomous vehicle with a lower traveling speed.
| 4. A server device that controls a plurality of autonomous vehicles traveling autonomously in accordance with a predetermined operation command, the server device comprising: priority setting means for setting priorities among the plurality of autonomous vehicles; and command means for, when a plurality of autonomous vehicles traveling on the same lane are detected, issuing operation commands to the plurality of autonomous vehicles so that traveling of an autonomous vehicle for which the priority setting means has set a higher priority is prioritized over traveling of an autonomous vehicle for which the priority setting means has set a lower priority.
| 5. An autonomous vehicle that travels autonomously in accordance with a predetermined operation command, the autonomous vehicle comprising: following-vehicle detection means for detecting another autonomous vehicle that follows the own vehicle on the same lane as the own vehicle; acquisition means for acquiring, when the other autonomous vehicle is detected by the following-vehicle detection means, information on a priority of the other autonomous vehicle through vehicle-to-vehicle communication with the other autonomous vehicle; and travel control means for controlling traveling of the own vehicle so that traveling of the other autonomous vehicle is prioritized over traveling of the own vehicle if the priority of the other autonomous vehicle acquired by the acquisition means is higher than the priority of the own vehicle. | The traveling control system comprises a controller having at least one processor. The controller, configured to set priorities (S14) among multiple autonomous traveling vehicles (100A, 100B), controls traveling of the multiple autonomous traveling vehicles so that an autonomous traveling vehicle whose priority is set to be high travels preferentially compared with an autonomous traveling vehicle whose priority is set to be low when the multiple autonomous traveling vehicles travel together on an identical lane. An INDEPENDENT CLAIM is included for a server apparatus for controlling multiple autonomous traveling vehicles. Traveling control system for controlling an autonomous traveling vehicle (Claimed). The traveling control system suppresses situations in which the subject vehicle disturbs the smooth traveling of a following autonomous traveling vehicle, and it allows the first autonomous traveling vehicle and the second autonomous traveling vehicle to travel in accordance with their respective needs. The drawing shows a flowchart of the flow of data and the processes performed between the respective constitutive components of the moving body system. 100A, 100B Autonomous traveling vehicles; 200 Server apparatus; S10 Position information; S14 Set priorities; S16 Departure command |
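Claim 5 above amounts to a yield rule executed on board: detect a follower on the same lane, obtain its priority over vehicle-to-vehicle communication, and give way when that priority is higher. A small Python sketch of that rule; the priority convention, speeds, and the choice to yield by slowing down are assumptions made for illustration:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    vehicle_id: str
    priority: int     # higher value = higher priority (assumed convention)
    speed_mps: float  # current target speed

def plan_speed(own, following, yield_speed_mps=5.0):
    """If a following autonomous vehicle on the same lane reports a higher
    priority over V2V, slow down so its traveling is prioritized (one possible
    way to give way); otherwise keep the own target speed."""
    if following is not None and following.priority > own.priority:
        return min(own.speed_mps, yield_speed_mps)
    return own.speed_mps

own = VehicleState("AV-1", priority=1, speed_mps=8.0)
follower = VehicleState("AV-2", priority=3, speed_mps=10.0)  # e.g. a higher fare class
print(plan_speed(own, follower))  # -> 5.0: the own vehicle yields to the follower
```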
Please summarize the input | V2X receiver-oriented V2X full-duplex positioning assistance. PROBLEM TO BE SOLVED: To accurately localize the transmission source of a V2X message.
SOLUTION: The disclosure describes embodiments for modifying an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication. A method includes receiving, by a remote vehicle, a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requests so that the remote vehicle identifies the ego vehicle as a transmitter of a V2X message. The method includes transmitting, by the remote vehicle, a response message including response data that describes the additional identifying information of the ego vehicle that provides the remote vehicle with the capacity to identify the ego vehicle as the transmitter of the V2X message when the ego vehicle is among a plurality of vehicles. The method includes receiving, by the remote vehicle, the V2X message that includes assistance data describing the additional identifying information. The method includes modifying, by the remote vehicle, the operation of the vehicle component of the remote vehicle based on the assistance data.
SELECTED DRAWING: Figure 1A|1. It is a method to correct operation|movement of a vehicle component based on V2X(Vehicle-to-Everything) communication,
Comprising:
What the said remote vehicle requires in order that the said remote vehicle may identify the own vehicle as a sender of V2-X message with a remote vehicle is a step which receives the request message which inquires of what kind of the said own vehicle it is additional identification information,
The step which transmits the response message containing the response data which describes the said additional identification information of the said own vehicle for providing the said remote vehicle with the capability to identify the said own vehicle as said sender of 2X of said V message with the said remote vehicle when the said own vehicle exists in several other vehicle,
The step which receives 2X of said V message which contains the assistance data which describe the said additional identification information with the said remote vehicle,
The method containing the step which corrects the said operation|movement of the said vehicle component of the said remote vehicle with the said remote vehicle based on the said assistance data.
| 2X of said V message is a DSRC (dedicated short range communications) message,
The method of Claim 1.
| 2X of said V message is not one of a Wi-Fi message, a 3-G message, a 4-G message, a 5-G message, the Long Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, and satellite communications,
The method of Claim 1.
| 2X of said V message contains the payload data which describe the said own vehicle,
The said assistance data help for the said remote vehicle to get to know which [ of these vehicles ] is described by payload data by the precision|reliability of a permissible level,
The said operation|movement of the said vehicle component is corrected based on the said payload data,
The method of Claim 1.
| 5. The said assistance data describe the position of the said own vehicle at the substantially half precision of the width|variety of the road the said own vehicle is drive|working,
The method of Claim 1.
| 6. The said request message is broadcast by the said own vehicle,
The said response message is unicasted with the said remote vehicle,
2X of said V message is broadcast by the said own vehicle,
The method of Claim 1.
| 7. The said remote vehicle is an autonomous vehicle,
The method of Claim 1.
| 8. A system for modifying an operation of a vehicle component based on V2X communication,
comprising:
the system being included in a remote vehicle,
a processor,
and a non-transitory memory communicably coupled to the processor,
wherein:
the non-transitory memory stores computer code operable, when executed by the processor, to cause the processor to perform:
a step of receiving, by the remote vehicle, a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requests so that the remote vehicle can identify the ego vehicle as a transmitter of a V2X message,
a step of transmitting, by the remote vehicle, a response message containing response data that describes the additional identifying information of the ego vehicle for providing the remote vehicle with the capability to identify the ego vehicle as the transmitter of the V2X message when the ego vehicle is among a plurality of other vehicles,
a step of receiving, by the remote vehicle, the V2X message containing assistance data that describes the additional identifying information,
The system provided with the non-transitory memory storing the computer code operable to perform a step of modifying, by the remote vehicle, the operation of the vehicle component of the remote vehicle based on the assistance data.
| 9. The V2X message is a DSRC (dedicated short range communications) message,
The system of Claim 8.
| 10. The V2X message is not one of a Wi-Fi message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, and satellite communications,
The system of Claim 8.
| 11. The V2X message contains payload data that describes the ego vehicle,
the assistance data helps the remote vehicle know, with an acceptable level of accuracy, which of the vehicles is described by the payload data,
the operation of the vehicle component is modified based on the payload data,
The system of Claim 8.
| 12. The assistance data describes the position of the ego vehicle with an accuracy of approximately half the width of the road on which the ego vehicle is traveling,
The system of Claim 8.
| 13. The request message is broadcast by the ego vehicle,
the response message is unicast by the remote vehicle,
the V2X message is broadcast by the ego vehicle,
The system of Claim 8.
| 14. The remote vehicle is an autonomous vehicle,
The system of Claim 8.
| 15. A computer program product operable to modify an operation of a vehicle component based on V2X communication,
comprising:
instructions that, when executed by a processor of a remote vehicle, cause the processor to perform:
an operation of receiving, by the remote vehicle, a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requests so that the remote vehicle can identify the ego vehicle as a transmitter of a V2X message,
an operation of transmitting, by the remote vehicle, a response message containing response data that describes the additional identifying information of the ego vehicle for providing the remote vehicle with the capability to identify the ego vehicle as the transmitter of the V2X message when the ego vehicle is among a plurality of other vehicles,
an operation of receiving, by the remote vehicle, the V2X message containing assistance data that describes the additional identifying information,
A computer program product provided with instructions that perform an operation of modifying, by the remote vehicle, the operation of the vehicle component of the remote vehicle based on the assistance data.
| 16. The V2X message is a DSRC (dedicated short range communications) message,
The computer program product of Claim 15.
| 17. The V2X message is not one of a Wi-Fi message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, and satellite communications,
The computer program product of Claim 15.
| 18. The V2X message contains payload data that describes the ego vehicle,
the assistance data helps the remote vehicle know, with an acceptable level of accuracy, which of the vehicles is described by the payload data,
the operation of the vehicle component is modified based on the payload data,
The computer program product of Claim 15.
| 19. The assistance data describes the position of the ego vehicle with an accuracy of approximately half the width of the road on which the ego vehicle is traveling,
The computer program product of Claim 15.
| 20. The request message is broadcast by the ego vehicle,
the response message is unicast by the remote vehicle,
the V2X message is broadcast by the ego vehicle,
The computer program product of Claim 15. | The method (300) involves receiving (307) a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requests so that the remote vehicle has the capacity to identify the ego vehicle as a transmitter of a V2X message. A response message which includes response data that describes the additional identifying information of the ego vehicle that provides the remote vehicle with the capacity to identify the ego vehicle as the transmitter of the V2X message is transmitted by the remote vehicle. The V2X message that includes assistance data describing the additional identifying information is received. The operation of the vehicle component of the remote vehicle is modified by the remote vehicle based on the assistance data. INDEPENDENT CLAIMS are included for the following: a system for modifying operation of a vehicle component based on vehicle-to-everything communication; and a computer program product for modifying operation of a vehicle. Method for modifying operation of a vehicle component of an ego vehicle or remote vehicle based on vehicle-to-everything (V2X) communication. The improved ability to identify the transmitter of different dedicated short-range communication (DSRC) messages improves the performance of connected ADAS systems and autonomous driving systems. The driver safely turns attention away from driving tasks and is prepared to take control of the autonomous vehicle when needed. The drawing shows a flow chart illustrating the method for modifying operation of the vehicle. 300 Method for modifying operation of vehicle component; 301 Step for recording the sensor data; 302 Step for analyzing the sensor data; 305 Step for broadcasting request message; 307 Step for receiving request message
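The request/response/assistance exchange recited in claims 1, 8, and 15 above can be sketched as three message builders. This is an illustrative Python sketch under an assumed JSON encoding and invented field names (fields_available, fields_wanted, lane_offset_m); the actual V2X/DSRC payload formats are not specified here.

```python
import json

def build_request(fields_available):
    """Ego vehicle: broadcast which identifying fields it could share, asking
    the remote vehicle which of them it needs."""
    return json.dumps({"type": "REQUEST", "fields_available": fields_available})

def build_response(fields_wanted):
    """Remote vehicle: unicast back the subset of fields it wants."""
    return json.dumps({"type": "RESPONSE", "fields_wanted": fields_wanted})

def build_v2x_message(payload, assistance):
    """Ego vehicle: broadcast the V2X message carrying the payload plus the
    requested assistance data."""
    return json.dumps({"type": "V2X", "payload": payload, "assistance": assistance})

# Illustrative round trip; field names are invented, not taken from any standard.
request = json.loads(build_request(["color", "lane_offset_m", "license_region"]))
response = json.loads(build_response(["color", "lane_offset_m"]))
known = {"color": "blue", "lane_offset_m": 0.4, "license_region": "CA"}
assistance = {k: known[k] for k in response["fields_wanted"]}
print(build_v2x_message({"speed_mps": 13.4, "heading_deg": 92.0}, assistance))
```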
Please summarize the input | SYSTEMS AND METHODS FOR ACTIVE ROAD SURFACE MAINTENANCE WITH CLOUD-BASED MOBILITY DIGITAL TWIN
An active road surface maintenance system and method developed for connected vehicles with the aid of a mobility digital twin (MDT) framework. A method performed in a cloud-based digital space includes receiving data regarding a physical object from a physical space connected to a vehicle. The method also includes processing the data using machine learning to model road surface conditions, in which respective penalty values are assigned to corresponding road surfaces, a respective penalty value being higher the lower a condition of the corresponding road surface. The method also includes deriving instructions based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory. The method further includes transmitting the instructions to the physical space connected to the vehicle to guide actuation of the vehicle. What is claimed is:
| 1. A method performed in a cloud-based digital space, comprising:
receiving data regarding a physical object from a physical space within which a vehicle is operating;
processing the data using machine learning to model road surface conditions, in which respective penalty values are assigned to corresponding road surfaces, a respective penalty value being inversely related to a condition of the corresponding road surface;
deriving instructions based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory; and
transmitting the instructions to the physical space connected to the vehicle to guide actuation of the vehicle.
| 2. The method of claim 1, wherein the processing of the data further comprises storing the data in a data lake.
| 3. The method of claim 2, wherein the data lake further comprises stored historical data.
| 4. The method of claim 3, wherein the processing of the data includes processing of the stored historical data in addition to the stored data received from the physical space.
| 5. The method of claim 1, wherein the physical object comprises at least one of a vehicle, a human, and a traffic device.
| 6. The method of claim 1, wherein the data is collected by one or more sensors communicating with the physical object.
| 7. The method of claim 6, wherein the collected data is real-time information relating to one or more of the following: road surfaces, traffic flow, weather, ego vehicle, perception of neighboring vehicle, or occupant of ego vehicle.
| 8. The method of claim 1, wherein the received data is obtained from one or more monitoring devices associated with the physical object and/or from one or more vehicle-to-anything (V2X) communications regarding the physical object.
| 9. The method of claim 1, further comprising effecting the actuation of the vehicle along the trajectory when the vehicle is an autonomous vehicle, or prompting a human driver of the vehicle to drive along the trajectory when the vehicle is operated by the human driver.
| 10. The method of claim 1, further comprising:
when the vehicle is operated by a human driver, displaying each respective lane of the road surfaces along the trajectory with an indicator that indicates the road surface condition of the respective lane.
| 11. The method of claim 1, wherein the processing further comprises processing the data using machine learning and historical data to model the road surface conditions and predict future road surface conditions, and using the predicted future road surface conditions to target road surfaces for maintenance.
| 12. The method of claim 1, further comprising applying a fusion process to the received data and filtering out noisy data.
| 13. The method of claim 12, wherein the fusion process is effected using a Kalman filter-based sensor fusion algorithm.
| 14. A cloud-based system effectuating an end-to-end framework, comprising:
a cloud-based platform hosting one or more digital twins corresponding to one or more physical objects from a physical space within which a vehicle is operating, wherein one of the digital twins comprises a data lake and an active road maintenance microservice;
a communications layer communicatively connecting the one or more digital twins to the one or more physical objects, wherein:
the communications layer transmits data regarding the one or more physical objects to at least the one or more corresponding digital twins, and
the communications layer transmits instructions that have been derived from processing of the transmitted data by at least the active road maintenance microservice to the physical space connected to the vehicle; and
wherein the active road maintenance microservice:
processes the data using machine learning to model road surface conditions, in which a rewards function assigns respective rewards values to corresponding road surfaces, a respective rewards value corresponding to a condition of the corresponding road surface, and
derives the instructions based on the modeled road surface conditions and on optimizing the rewards function to guide actuation of the vehicle along a trajectory.
| 15. The cloud-based system of claim 14, wherein the one or more physical objects comprise at least one of a vehicle, a human, and a traffic device.
| 16. The cloud-based system of claim 14, wherein the data lake further comprises stored historical data, and wherein the processing of the data further comprises storing the transmitted data in the data lake and processing the stored historical data in addition to the stored transmitted data.
| 17. The cloud-based system of claim 14,
wherein the data is collected by one or more sensors communicating with the one or more physical objects,
wherein the sensors are equipped on one or more of roads, vehicles, vehicle occupants, or pedestrians, and
wherein the sensors are one or more of perception sensor, ultrasonic sensor, camera, radar, LIDAR, in-pavement surface temperature and condition sensor, in-pavement surface chemical and concentration sensor, wearable device, road surface sensor, in-cabin sensor, mobile app, loop detector, condition sensor.
| 18. The cloud-based system of claim 14, further comprising when the vehicle is a non-autonomous vehicle, displaying each respective lane of the road surfaces along the trajectory with an indicator that indicates the road surface condition of the respective lane.
| 19. A method performed in a cloud-based system effectuating an end-to-end framework, comprising:
in a digital space:
receiving data regarding a physical object from a physical space within which a vehicle is operating;
processing the data using machine learning to model road surface conditions, including using a rewards function to assign respective rewards values to corresponding road surfaces, a respective rewards value being related to a condition of the corresponding road surface;
deriving instructions based on optimizing the rewards function to guide actuation of the vehicle along a trajectory;
transmitting the instructions to the physical space connected to the vehicle to guide actuation of the vehicle;
in the physical space:
receiving the transmitted instructions;
determining whether the vehicle is an autonomous vehicle or a non-autonomous vehicle; and
when the vehicle is an autonomous vehicle, navigating the vehicle along the trajectory using the instructions, or when the vehicle is a non-autonomous vehicle, prompting a human driver to navigate the vehicle along the trajectory using the instructions.
| 20. The method of claim 19, further comprising:
when the vehicle is a non-autonomous vehicle, displaying each respective lane of the road surfaces along the trajectory with an indicator that indicates the road surface condition of the respective lane. | The method (500) involves receiving data regarding a physical object from a physical space within which a vehicle i.e. car is operated (502), where the physical object comprises at least one vehicle, a human and a traffic device. The data is processed (504) using machine learning to model road surface conditions in which respective penalty values are assigned to corresponding road surfaces, where the respective penalty value is inversely related to a condition of the corresponding road surface. Instructions are derived (506) based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory. The instructions are transmitted (508) to the physical space connected to the vehicle to guide actuation of the vehicle. The data is stored in a data lake, where the data lake further comprises stored historical data. The fusion process is effected by using a Kalman filter-based sensor fusion algorithm. INDEPENDENT CLAIMS are included for: (1) a cloud-based system effectuating an end-to-end framework; and (2) a method for performing active road surface maintenance in a cloud-based digital space for connected vehicles with the aid of a mobility digital twin framework. Method for performing active road surface maintenance in cloud-based digital space for connected vehicles i.e. car with the aid of a mobility digital twin framework i.e. end-to-end framework. The method enables the guidance information to be generated by road digital twins and sent back to the connected vehicles in the real world, thus assisting the autonomous vehicles or the human drivers to drive in a certain way to avoid excessive loads on certain areas of the road surface in an effective manner. The drawing shows a flow diagram illustrating a method for performing active road surface maintenance in cloud-based digital space for connected vehicles with the aid of a mobility digital twin framework. 500 Method for performing active road surface maintenance in cloud-based digital space for connected vehicles with aid of a mobility digital twin framework; 502 Method for receiving data regarding a physical object from a physical space; 504 Method for processing data using machine learning to model road surface conditions; 506 Method for deriving instructions based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory; 508 Method for transmitting instructions to the physical space connected to the vehicle to guide actuation of the vehicle
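The rewards/penalty formulation in claims 14 and 19 above can be illustrated with a toy lane-selection sketch in Python. The reward function, its weights, and the lane attributes (condition, detour_cost) are assumptions for illustration, not the microservice's actual model.

```python
def lane_reward(surface_condition: float, detour_cost: float,
                condition_weight: float = 1.0, detour_weight: float = 0.2) -> float:
    """Toy rewards function: a better surface condition raises the reward,
    a longer detour lowers it. The weights are illustrative tuning knobs."""
    return condition_weight * surface_condition - detour_weight * detour_cost

def choose_lane(lanes):
    """Pick the lane whose modeled condition/detour trade-off maximizes reward."""
    return max(lanes, key=lambda lane: lane_reward(lane["condition"], lane["detour_cost"]))

lanes = [
    {"id": "lane_1", "condition": 0.35, "detour_cost": 0.0},  # worn surface, no detour
    {"id": "lane_2", "condition": 0.90, "detour_cost": 1.0},  # good surface, small detour
]
print(choose_lane(lanes)["id"])  # -> lane_2 with these weights
```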
Please summarize the input | Controller and autonomous driving vehicle
A controller for transmitting control information necessary for autonomous driving to an autonomous driving vehicle is provided. The controller includes: a reception device configured to receive a signal transmitted from a wireless communication device included in the autonomous driving vehicle; a first transmission device configured to transmit first control information to the wireless communication device included in the autonomous driving vehicle; and a second transmission device configured to transmit information to a satellite based on a reception condition of the signal from the autonomous driving vehicle received by the reception device, the information being used by the satellite to transmit second control information to a satellite communication device included in the autonomous driving vehicle. What is claimed is:
| 1. A controller for transmitting control information necessary for autonomous driving to an autonomous driving vehicle, comprising:
a receiver configured to receive a signal transmitted from a wireless communication device of the autonomous driving vehicle;
a first transmitter configured to transmit, when the signal is not interrupted, first control information to the wireless communication device of the autonomous driving vehicle, the first control information including a speed limit; and
a second transmitter configured to transmit, when the signal is interrupted, information to a satellite that is used by the satellite to transmit second control information configured to control a component of the autonomous driving vehicle to limit speed of the autonomous driving vehicle, from the satellite directly to a satellite receiver of the autonomous driving vehicle, the second control information not including the speed limit and being smaller in data volume than the first control information.
| 2. The controller according to claim 1, wherein:
the first control information is generated based on the signal from the autonomous driving vehicle received by the receiver; and
the second control information is generated based on a signal from a vehicle other than the autonomous driving vehicle received by the receiver.
| 3. A controller for transmitting control information necessary for autonomous driving to an autonomous driving vehicle, comprising:
a receiver configured to receive a signal transmitted from a wireless communication device of the autonomous driving vehicle;
a first transmitter configured to transmit, when the signal is not interrupted, first control information to the wireless communication device of the autonomous driving vehicle, the first control information including a speed limit; and
a second transmitter configured to transmit, when the signal is interrupted, information to another vehicle that is used by the other vehicle to transmit second control information configured to control a component of the autonomous driving vehicle to limit speed of the autonomous driving vehicle, to a vehicle-to-vehicle communication device of the autonomous driving vehicle, the second control information not including the speed limit and being smaller in data volume than the first control information.
| 4. An autonomous driving vehicle, comprising:
a transmitter configured to transmit a signal to a controller;
a receiver configured to receive first control information from the controller when the signal is not interrupted, the first control information including a speed limit; and
a satellite receiver configured to receive second control information from a satellite when the signal is interrupted, the second control information being control instructions configured to control a component of the autonomous driving vehicle to limit speed of the autonomous driving vehicle, the second control information not including the speed limit and being smaller in data volume than the first control information.
| 5. The autonomous driving vehicle according to claim 4, wherein the satellite receiver receives the second control information based on failure of transmission of the signal to the controller with use of the transmitter.
| 6. The autonomous driving vehicle according to claim 4, further comprising a first controller configured to generate the second control information based on the first control information. | The controller for transmitting control information necessary for an autonomous driving vehicle (10) comprises a reception device to receive a signal transmitted from a wireless communication device (50) included in the autonomous driving vehicle. A first transmission device transmits first control information to the wireless communication device included in the autonomous driving vehicle. A second transmission device transmits information to a satellite based on a reception condition of the signal from the autonomous driving vehicle received by the reception device. The information is used by the satellite to transmit second control information to a satellite communication device included in the autonomous driving vehicle. The automated driving control system determines whether or not autonomous driving is possible based on the location acquired from the estimation unit (24). An INDEPENDENT CLAIM is included for: an autonomous driving vehicle. Controller for transmitting control information necessary for an autonomous driving vehicle even when a communication failure occurs. The controller allows continuous autonomous driving even when a communication failure occurs. The reception condition from the autonomous driving vehicle may include interruption of the signal, reception of an urgent signal indicating abnormality from the autonomous driving vehicle, and reception of a signal generated in abnormal situations. When there is not enough information to estimate the vehicle speed of the autonomous driving vehicle, the control device of the control center may generate a control instruction for simply shifting to a constant speed low enough to observe the speed limit, and may transmit the control instruction to the communication satellite as second control information. When the autonomous driving vehicle fails to transmit the location information, the controller switches to second control information received from the satellite communication device for temporary location identification. The drawing shows a block diagram of a controller for transmitting control information necessary for an autonomous driving vehicle. 10 Autonomous driving vehicle; 24 Estimation unit; 50 Wireless communication device; 60 Input-output device
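A minimal sketch of the fallback behavior described in the claims above: apply the full first control information while the wireless link is up, and switch to the compact satellite-relayed second control information when the signal is interrupted. The dictionary keys and the "reduce to a safe constant speed" wording are assumptions for illustration, not the claimed implementation.

```python
def select_control(signal_ok: bool, first_control=None):
    """Choose which control information the vehicle should apply (illustrative only).

    When the wireless link to the controller is up, apply the full first
    control information (which carries the speed limit). When the link is
    interrupted, fall back to the compact second control information received
    via satellite, which only instructs the vehicle to limit its speed.
    """
    if signal_ok and first_control is not None:
        return {"source": "wireless", "speed_limit_kph": first_control["speed_limit_kph"]}
    # Second control information: smaller payload, no explicit speed limit.
    return {"source": "satellite", "command": "reduce_to_safe_constant_speed"}

print(select_control(signal_ok=True, first_control={"speed_limit_kph": 60}))
print(select_control(signal_ok=False))
```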
Please summarize the input | Vehicle-to-everything communication-based lane change collision avoidance warning
The disclosure describes embodiments for modifying whether an ego vehicle changes lanes to a target lane at a target time based on a payload of a Vehicle-to-Everything (V2X) message originated by a remote vehicle. In some embodiments, a method includes determining, based on the payload, whether the remote vehicle is changing lanes to the target lane at the target time. The method includes determining that the ego vehicle is changing lanes to the target lane at approximately the target time. The method includes estimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time. The method includes modifying an operation of a vehicle component of the ego vehicle so that the ego vehicle does not change lanes to the target lane at the target time. What is claimed is:
| 1. A method executed by an ego vehicle that includes an autonomous driving system, the method comprising:
determining, based on remote data that describes sensor measurements for a remote vehicle that are described in a Vehicle-to-Everything (V2X) message originated by the remote vehicle, that the remote vehicle is changing lanes to a target lane at a target time, wherein the remote data includes one or more of a turning direction for a turn signal of the remote vehicle when the turn signal is engaged and a change in a steering wheel angle of the remote vehicle over time;
determining that the ego vehicle is changing lanes to the target lane at the target time;
estimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time; and
modifying, with the autonomous driving system, a steering wheel of the ego vehicle so that the ego vehicle does not steer into the target lane at the target time wherein this modifying is based on the determination that the ego vehicle is changing lanes to the target lane at the target time.
| 2. The method of claim 1, further comprising:
generating warning data based on estimating that the ego vehicle and the remote vehicle will collide; and
inputting the warning data to the autonomous driving system;
wherein modifying the steering wheel is responsive to the warning data being input to the autonomous driving system and based on the determination that the ego vehicle is changing lanes to the target lane at the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time.
| 3. The method of claim 2, wherein the warning data includes digital data that is operable, when inputted to an actuator of the autonomous driving system, to cause the actuator to provide a counter-steering force on the steering wheel.
| 4. The method of claim 1, wherein the V2X message is received by a V2X radio.
| 5. The method of claim 1, wherein the remote data further includes a location of the remote vehicle that is accurate to within 1.5 meters.
| 6. The method of claim 4, wherein the V2X radio includes a plurality of channels including a first reserved channel that is reserved for receiving the V2X message.
| 7. The method of claim 6, wherein the first reserved channel is reserved for a Basic Safety Message and a second reserved channel is reserved for receiving a Pedestrian Safety Message.
| 8. The method of claim 1, wherein the V2X message is a basic safety message and the sensor measurements include a speed, a heading, and the steering wheel angle of the remote vehicle.
| 9. A system included in an ego vehicle, the system comprising:
a processor communicatively coupled to an autonomous driving system, a Vehicle-to-Everything (V2X) radio, and a non-transitory memory, wherein the V2X radio is operable to receive a V2X message and the non-transitory memory stores computer code that is operable, when executed by the processor, to cause the processor to:
determine, based on remote data that describes sensor measurements for a remote vehicle that are described in a Vehicle-to-Everything (V2X) message originated by the remote vehicle, that the remote vehicle is changing lanes to a target lane at a target time, wherein the remote data includes one or more of a turning direction for a turn signal of the remote vehicle when the turn signal is engaged or a change in a steering wheel angle of the remote vehicle over time;
determine that the ego vehicle is changing lanes to the target lane at the target time;
estimate that the ego vehicle and the remote vehicle will collide at the target lane at the target time; and
modify, with the autonomous driving system, a steering wheel of the ego vehicle so that the ego vehicle does not change lanes to the target lane at the target time wherein this modification is based on the determination that the ego vehicle is changing lanes to the target lane at the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time.
| 10. The system of claim 9, wherein the computer code is further operable to cause an audio system to generate an auditory warning.
| 11. The system of claim 9, wherein the computer code is further operable to cause the processor to:
generating warning data based on estimating that the ego vehicle and the remote vehicle will collide; and
inputting the warning data to the autonomous driving system;
wherein modifying the steering wheel is responsive to the warning data being input to the autonomous driving system and based on the determination that the ego vehicle is changing lanes to the target lane at approximately the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time.
| 12. The system of claim 9, wherein the V2X message is received by a V2X radio.
| 13. The system of claim 11, wherein the warning data includes digital data that is operable, when inputted to an actuator of the autonomous driving system, to cause the actuator to provide a counter-steering force on the steering wheel.
| 14. The system of claim 12, wherein the V2X radio includes a plurality of channels including a first reserved channel that is reserved for receiving the V2X message.
| 15. The system of claim 14, wherein the first reserved channel is reserved for a Basic Safety Message and a second reserved channel is reserved for receiving a Pedestrian Safety Message.
| 16. The system of claim 9, wherein the V2X message is a basic safety message and the sensor measurements include a speed, a heading, and the steering wheel angle of the remote vehicle.
| 17. A computer program product comprising instructions that, when executed by a processor of an ego vehicle including an autonomous driving system, causes the processor to perform operations comprising:
determining, based on remote data that describes sensor measurements for a remote vehicle that are described in a Vehicle-to-Everything (V2X) message originated by the remote vehicle, that the remote vehicle is changing lanes to a target lane at a target time, wherein the remote data includes one or more of a turning direction for a turn signal of the remote vehicle when the turn signal is engaged or a change in a steering wheel angle of the remote vehicle over time;
determining that the ego vehicle is changing lanes to the target lane at the target time;
estimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time; and
modifying, with the autonomous driving system, a steering wheel of the ego vehicle so that the ego vehicle does not change lanes to the target lane at the target time wherein the modifying is based on the determination that the ego vehicle is changing lanes to the target lane at the target time and the estimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time.
| 18. The computer program product of claim 17, wherein the operations further comprise:
generating warning data based on estimating that the ego vehicle and the remote vehicle will collide; and
inputting the warning data to the autonomous driving system;
wherein modifying the steering wheel is responsive to the warning data being input to the autonomous driving system and based on the determination that the ego vehicle is changing lanes to the target lane at the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time.
| 19. The computer program product of claim 17, wherein the operations further comprise providing a warning to a driver of the ego vehicle.
| 20. The computer program product of claim 19, wherein the warning is selected from a group that consists of: generating a warning message that is displayed on a display device of the ego vehicle; and generating a warning sound that is played over a speaker of the ego vehicle. | The method involves determining, based on a payload for a Vehicle-to-Everything (V2X) message originated by a remote vehicle, whether the remote vehicle is changing lanes to a target lane at a target time. It is determined that the ego vehicle is changing lanes to the target lane at approximately the target time. It is estimated that the ego vehicle and the remote vehicle will collide at the target lane at the target time. An operation of a vehicle component of the ego vehicle is modified so that the ego vehicle does not change lanes to the target lane at the target time. The Vehicle-to-Everything message is a dedicated Short-Range Communication message. INDEPENDENT CLAIMS are included for the following: a system included in an ego vehicle; and a computer program product comprising instructions that, when executed by a processor, cause the processor to perform operations. Method for providing a warning to a driver of an ego vehicle about a potential collision. When the automated system is enabled, driver attention is not required for the autonomous vehicle to operate safely and consistent with accepted norms. The drawing shows a schematic representation of an operating environment for a warning system. 100 Operating environment; 105 Network; 123 Ego vehicle; 124 Remote vehicle; 125 Processor
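The collision estimate that drives the steering intervention in the claims above can be approximated with a simple same-lane/same-time/too-close check. This Python sketch is illustrative only; the field names and the time and gap thresholds are assumed values, not taken from the patent.

```python
def predict_collision(ego, remote, target_lane: int, target_time: float,
                      time_tolerance: float = 0.5, gap_threshold_m: float = 5.0) -> bool:
    """Rough check of the warning condition: both vehicles aim for the same
    lane at roughly the same time and will be too close longitudinally.

    `ego` and `remote` are dicts with a planned lane, lane-change time,
    position along the road, and speed; all thresholds are illustrative.
    """
    same_lane = ego["target_lane"] == target_lane and remote["target_lane"] == target_lane
    same_time = (abs(ego["change_time_s"] - target_time) <= time_tolerance
                 and abs(remote["change_time_s"] - target_time) <= time_tolerance)
    ego_pos = ego["position_m"] + ego["speed_mps"] * target_time
    remote_pos = remote["position_m"] + remote["speed_mps"] * target_time
    return same_lane and same_time and abs(ego_pos - remote_pos) < gap_threshold_m

ego = {"target_lane": 2, "change_time_s": 3.0, "position_m": 0.0, "speed_mps": 25.0}
remote = {"target_lane": 2, "change_time_s": 3.2, "position_m": 4.0, "speed_mps": 24.0}
if predict_collision(ego, remote, target_lane=2, target_time=3.0):
    print("Apply counter-steering force: abort the lane change")
```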
Please summarize the input | Generating real-time high-definition (HD) maps using wireless vehicle data of a remote vehicle
The disclosure includes embodiments for generating a real-time high-definition (HD) map for an ego vehicle using wireless vehicle data of a remote vehicle. In some embodiments, a method includes receiving a V2X wireless message which includes remote GPS data and remote road parameter data of the remote vehicle. The method includes retrieving ego GPS data of the ego vehicle. The method includes generating ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located. The method includes fusing the ego road parameter data and the remote road parameter data to form fused road parameter data which describes an improved estimate of the geometry of the road that is more accurate than the initial estimate. The method includes generating a real-time HD map based on the remote GPS data, the ego GPS data, and the fused road parameter data. What is claimed is:
| 1. A method for an ego vehicle including a Global Positioning System (GPS) unit, comprising:
receiving a Vehicle-to-Anything (V2X) wireless message from a network, wherein the V2X wireless message includes remote GPS data, remote road parameter data, and path history data describing path history points of a remote vehicle generated by the remote vehicle;
retrieving ego GPS data from the GPS unit of the ego vehicle;
generating ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located;
fusing the ego road parameter data and the remote road parameter data to form fused road parameter data; and
generating high-definition (HD) map data describing a real-time HD map based on the remote GPS data, the ego GPS data, the path history data, and the fused road parameter data by:
generating one or more interpolated points based on the path history points; and
generating a path of the remote vehicle on the real-time HD map based on the path history points and the one or more interpolated points.
| 2. The method of claim 1, wherein the ego vehicle further includes a vehicle control system and the method further comprises:
inputting the HD map data to the vehicle control system to improve accuracy of a vehicle tracking technique controlled by the vehicle control system; and
executing the vehicle tracking technique based on the HD map data to track movement of the remote vehicle.
| 3. The method of claim 2, wherein the vehicle tracking technique is a vehicle-lane-change detection technique, and executing the vehicle tracking technique based on the HD map data comprises:
executing the vehicle-lane-change detection technique to detect a lane change of the remote vehicle based on the HD map data.
| 4. The method of claim 2, wherein inputting the HD map data to the vehicle control system includes modifying an operation of the vehicle control system based on the HD map data so that the vehicle control system controls an operation of the ego vehicle based on the HD map data.
| 5. The method of claim 2, wherein the vehicle control system includes one of an Advanced Driver Assistance System (ADAS system) or an autonomous driving system.
| 6. The method of claim 1, wherein:
the remote road parameter data includes a lateral offset of the remote vehicle from a center of a reference lane; and
a lateral offset for each of the one or more interpolated points is estimated based on the lateral offset of the remote vehicle.
| 7. The method of claim 6, wherein the remote road parameter data further includes one or more of a relative heading of the remote vehicle, curvature of the reference lane, or a curvature change rate of the reference lane.
| 8. The method of claim 1, wherein the real-time HD map provides an estimate of a position of the remote vehicle that is accurate within plus or minus half a width of a lane on the road.
| 9. The method of claim 1, wherein the remote GPS data describes a geographical location of the remote vehicle, and the ego GPS data describes a geographical location of the ego vehicle.
| 10. The method of claim 1, wherein the fusing is achieved by Kalman filtering.
| 11. The method of claim 1, wherein the V2X wireless message is selected from a group that consists of: a Basic Safety Message; a Long-Term Evolution (LTE) message; a LTE-V2X message; a 5G-LTE message; or a millimeter wave message.
| 12. A system comprising:
a Global Positioning System (GPS) unit of an ego vehicle; and
an onboard vehicle computer system that is communicatively coupled to the GPS unit, the onboard vehicle computer system including a non-transitory memory storing computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system to:
receive a Vehicle-to-Anything (V2X) wireless message from a network, wherein the V2X wireless message includes remote GPS data, remote road parameter data, and path history data describing path history points of a remote vehicle generated by the remote vehicle;
retrieve ego GPS data from the GPS unit of the ego vehicle;
generate ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located;
fuse the ego road parameter data and the remote road parameter data to form fused road parameter data; and
generate high-definition (HD) map data describing a real-time HD map based on the remote GPS data, the ego GPS data, the path history data, and the fused road parameter data by:
generating one or more interpolated points based on the path history points; and
generating a path of the remote vehicle on the real-time HD map based on the path history points and the one or more interpolated points.
| 13. The system of claim 12, wherein the system further includes a vehicle control system and the computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system further to:
input the HD map data to the vehicle control system to improve accuracy of a vehicle tracking technique controlled by the vehicle control system; and
execute the vehicle tracking technique based on the HD map data to track movement of the remote vehicle.
| 14. The system of claim 13, wherein the computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system further to:
input the HD map data to the vehicle control system for modifying an operation of the vehicle control system based on the HD map data so that the vehicle control system controls an operation of the ego vehicle based on the HD map data.
| 15. The system of claim 13, wherein the vehicle control system includes one of an Advanced Driver Assistance System (ADAS system) or an autonomous driving system.
| 16. The system of claim 12, wherein:
the remote road parameter data includes a lateral offset of the remote vehicle from a center of a reference lane; and
a lateral offset for each of the one or more interpolated points is estimated based on the lateral offset of the remote vehicle.
| 17. The system of claim 16, wherein the remote road parameter data further includes one or more of a relative heading of the remote vehicle, curvature of the reference lane, or a curvature change rate of the reference lane.
| 18. A computer program product comprising a non-transitory memory of an onboard vehicle computer system of an ego vehicle storing computer-executable code that, when executed by a processor, causes the processor to:
receive a Vehicle-to-Anything (V2X) wireless message from a network, wherein the V2X wireless message includes remote GPS data, remote road parameter data, and path history data describing path history points of a remote vehicle generated by the remote vehicle;
retrieve ego GPS data from a GPS unit of the ego vehicle;
generate ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located;
fuse the ego road parameter data and the remote road parameter data to form fused road parameter data; and
generate high-definition (HD) map data describing a real-time HD map based on the remote GPS data, the ego GPS data, the path history data, and the fused road parameter data by:
generating one or more interpolated points based on the path history points; and
generating a path of the remote vehicle on the real-time HD map based on the path history points and the one or more interpolated points.
| 19. The computer program product of claim 18, wherein the computer-executable code that, when executed by the processor, causes the processor further to:
input the HD map data to a vehicle control system to improve accuracy of a vehicle tracking technique controlled by the vehicle control system; and
execute the vehicle tracking technique based on the HD map data to track movement of the remote vehicle.
| 20. The computer program product of claim 18, wherein the computer-executable code that, when executed by the processor, causes the processor further to:
input the HD map data to a vehicle control system for modifying an operation of the vehicle control system based on the HD map data so that the vehicle control system controls an operation of the ego vehicle based on the HD map data. | The method involves receiving a vehicle-to-anything wireless message from a network (125), where the vehicle-to-anything wireless message includes remote global positioning system data and remote road parameter data generated by a remote vehicle (124). The ego global positioning system data is retrieved from the global positioning system unit of an ego vehicle (123). The ego road parameter data is generated, which describes an initial estimate of a geometry of a road, on which the ego vehicle is located. INDEPENDENT CLAIMS are included for the following: a system for generating a real-time high-definition map for an ego vehicle having a global positioning system unit; and a computer program product. Method for generating a real-time high-definition map for an ego vehicle having a global positioning system unit. The accuracy of real-time high-definition maps is ensured. The drawing shows a schematic representation of a process for generating a real-time high-definition map. 123 Ego vehicle; 124 Remote vehicle; 125 Network; 127 Memory; 145 Communication unit
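Two pieces of the HD-map pipeline above lend themselves to short sketches: Kalman-style (inverse-variance) fusion of one road parameter, and linear interpolation between path history points. Both functions are illustrative assumptions about how such steps could look, not the claimed implementation; the variance values in the example are made up.

```python
def fuse(ego_value: float, ego_var: float, remote_value: float, remote_var: float):
    """Inverse-variance fusion of one road parameter (e.g. lane curvature)
    from the ego estimate and the remote vehicle's V2X report."""
    w_ego = 1.0 / ego_var
    w_remote = 1.0 / remote_var
    fused = (w_ego * ego_value + w_remote * remote_value) / (w_ego + w_remote)
    fused_var = 1.0 / (w_ego + w_remote)
    return fused, fused_var

def interpolate_path(points, samples_per_segment: int = 4):
    """Densify the remote vehicle's path history points by linear interpolation
    so the path drawn on the real-time map is smooth."""
    dense = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for i in range(samples_per_segment):
            t = i / samples_per_segment
            dense.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    dense.append(points[-1])
    return dense

print(fuse(ego_value=0.012, ego_var=4e-4, remote_value=0.010, remote_var=1e-4))
print(interpolate_path([(0.0, 0.0), (10.0, 1.0), (20.0, 1.5)]))
```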
Please summarize the input | Theft deterrent system for connected vehicles based on wireless messages
The disclosure includes embodiments for theft deterrent for a connected vehicle using Basic Safety Message (BSM)-based Vehicle-to-Anything (V2X) communication. In some embodiments, a method includes receiving, by a V2X radio, a wireless message that is transmitted by a first connected vehicle whose ignition is disengaged. In some embodiments, the method includes determining, based on a payload of the wireless message, that the first connected vehicle is being subjected to criminal activity. In some embodiments, the method includes taking a remedial action responsive to determining that the first connected vehicle is being subjected to criminal activity. In some embodiments, the remedial action is operable to deter the occurrence of the criminal activity (i.e., stop the first connected vehicle from being stolen) or gather visual evidence of the criminal activity. What is claimed is:
| 1. A method comprising:
transmitting, by a first Vehicle-to-Anything (V2X) radio of a first connected vehicle to a second connected vehicle, a first wireless message, wherein the first V2X radio is powered on while an ignition of the first connected vehicle is disengaged and responsive to the ignition being disengaged switching from a first transmission rate to a second transmission rate to prevent battery drainage and wherein the second transmission rate is less frequent than the first transmission rate;
determining, based on a payload of the first wireless message, whether the first connected vehicle is being subjected to criminal activity;
responsive to determining that the first connected vehicle is being subjected to criminal activity, taking a remedial action that includes triggering an alarm system of the second connected vehicle so that the alarm system provides a warning notification that the first connected vehicle is being subjected to criminal activity;
and transmitting, to the second connected vehicle, a disengagement notification that states that the first connected vehicle is going to disengage, wherein the disengagement notification is triggered by a low battery level.
| 2. The method of claim 1, wherein the first transmission rate is once every 0.10 seconds and the second transmission rate is once every three to five seconds.
| 3. The method of claim 1, wherein the disengagement notification is part of a basic safety message.
| 4. The method of claim 1, wherein the first connected vehicle is an autonomous vehicle.
| 5. The method of claim 1, wherein the warning notification includes one or more of honking a horn or flashing headlamps.
| 6. The method of claim 1, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their alarm systems and simultaneously provide the warning notification that the first connected vehicle is being subjected to criminal activity.
| 7. The method of claim 1, wherein the first connected vehicle is parked and an ignition of the first connected vehicle is disengaged.
| 8. The method of claim 1, wherein the remedial action further includes activating one or more onboard external cameras of the first connected vehicle so that one or more of images and video of the criminal activity are recorded.
| 9. The method of claim 1, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their onboard external cameras and simultaneously record one or more of images and video of the criminal activity from various points of view.
| 10. The method of claim 8, wherein the one or more of the images and the video are wirelessly transmitted to a third connected device that is operated by a law enforcement agency.
| 11. The method of claim 1, wherein the remedial action further includes displaying a warning message indicating that the first connected vehicle is being subjected to criminal activity or one or more images of the criminal activity.
| 12. A system comprising:
a processor communicatively coupled to a first Vehicle-to-Anything (V2X) radio of a first connected vehicle and non-transitory memory, wherein the first V2X radio is operable to transmit a first wireless message to a second connected vehicle, the first V2X radio is powered on while an ignition of the first connected vehicle is disengaged and responsive to the ignition being disengaged switching from a first transmission rate to a second transmission rate to prevent battery drainage, the second transmission rate is less frequent than the first transmission rate, and the non-transitory memory stores computer code that is operable, when executed by the processor, to cause the processor to:
determine, based on a payload of the first wireless message, that the first connected vehicle is being subjected to criminal activity;
responsive to determining that the first connected vehicle is being subjected to criminal activity, take a remedial action that includes triggering an alarm system of the second connected vehicle so that the alarm system provides a warning notification that the first connected vehicle is being subjected to criminal activity;
and transmit, to the second connected vehicle, a disengagement notification that states that the first connected vehicle is going to disengage, wherein the disengagement notification is triggered by a low battery level.
| 13. The system of claim 12, wherein the first connected vehicle is an autonomous vehicle.
| 14. The system of claim 12, wherein the first transmission rate is once every 0.10 seconds and the second transmission rate is once every three to five seconds.
| 15. The system of claim 12, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their alarm systems and simultaneously provide the warning notification that the first connected vehicle is being subjected to criminal activity.
| 16. The system of claim 13, wherein the first connected vehicle is parked and an ignition of the first connected vehicle is disengaged.
| 17. The system of claim 12, wherein the remedial action further includes activating one or more onboard external cameras of the first connected vehicle so that one or more of images and video of the criminal activity are recorded.
| 18. The system of claim 12, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their onboard external cameras and simultaneously record one or more of images and video of the criminal activity from various points of view.
| 19. The system of claim 18, wherein the remedial action further includes displaying a warning message indicating that the first connected vehicle is being subjected to criminal activity or one or more images of the criminal activity.
| 20. A non-transitory computer program product comprising instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising:
instructing a first Vehicle-to-Anything (V2X) radio of a first connected vehicle to transmit a first wireless message to a second connected vehicle, wherein the first V2X radio is powered on while an ignition of the first connected vehicle is disengaged and responsive to the ignition being disengaged switching from a first transmission rate to a second transmission rate to prevent battery drainage and wherein the second transmission rate is less frequent than the first transmission rate;
determining, based on a payload of the first wireless message, that the first connected vehicle is being subjected to criminal activity;
responsive to determining that the first connected vehicle is being subjected to criminal activity, taking a remedial action that includes triggering an alarm system of the second connected vehicle so that the alarm system provides a warning notification that the first connected vehicle is being subjected to criminal activity;
and transmitting, to the second connected vehicle, a disengagement notification that states that the first connected vehicle is going to disengage, wherein the disengagement notification is triggered by a low battery level. | The method involves receiving, by a V2X radio (144), a wireless message that is transmitted by a first connected vehicle whose ignition is disengaged. Whether the first connected vehicle is subjected to criminal activity is determined based on a payload of the wireless message. A remedial action is taken responsive to determining that the first connected vehicle is subjected to criminal activity. The wireless message is a dedicated short-range communication (DSRC) message and the payload is compliant with the DSRC standard. The DSRC message is not one of the following: wireless fidelity (WiFi) message, third generation (3G) message, fourth generation (4G) message, long-term evolution (LTE) message, millimeter wave communication message, Bluetooth message and a satellite communication. INDEPENDENT CLAIMS are included for the following: a theft deterrent system for a connected vehicle; and a computer program product for performing theft deterrent for a connected vehicle. Method for performing theft deterrent for a connected vehicle using basic safety message (BSM)-based vehicle-to-anything (V2X) communication. Driver attention is not required for the autonomous vehicle to operate safely and consistent with accepted norms when the automated system is enabled. A global positioning system (GPS) unit provides positional information that positions the conventional GPS unit with an accuracy of plus or minus 10 meters of the actual position of the conventional GPS unit. The deterrent system provides reduced vehicle theft, improves the performance of a car alarm system and reduces the cost of manufacturing the car alarm system by connecting multiple vehicles through BSM-based V2X communication. A vehicle theft or suspicious circumstance is detected by the deterrent system. The drawing shows a block diagram of a deterrent system for a connected vehicle. 105 Network; 120 Bus; 125 Processor; 127 Memory; 144 V2X radio
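The transmission-rate switching and the neighboring vehicle's remedial actions described in the claims above can be sketched as follows. The interval constants come from the claimed ranges, while the payload keys (tamper_detected, battery_low) and the action names are assumptions made only for illustration.

```python
PARKED_INTERVAL_S = 4.0    # within the claimed once-every-three-to-five-seconds range
DRIVING_INTERVAL_S = 0.10  # once every 0.10 seconds while the ignition is engaged

def beacon_interval(ignition_on: bool) -> float:
    """Slow the V2X beacon when the ignition is off to avoid draining the battery."""
    return DRIVING_INTERVAL_S if ignition_on else PARKED_INTERVAL_S

def handle_message(payload: dict) -> list:
    """A nearby vehicle's reaction to a parked vehicle's beacon (illustrative payload keys)."""
    actions = []
    if payload.get("tamper_detected"):
        actions += ["sound_horn", "flash_headlamps", "record_external_cameras"]
    if payload.get("battery_low"):
        actions.append("log_disengagement_notice")
    return actions

print(beacon_interval(ignition_on=False))        # -> 4.0
print(handle_message({"tamper_detected": True}))
```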
Please summarize the input | Target-lane relationship recognition apparatus. A target-lane relationship recognition apparatus mounted on a vehicle includes a sensor that detects a situation around the vehicle, a memory device in which a map data indicating a boundary position of a lane on a map is stored, and a processing device. The processing device is configured to: (a) acquire, based on the sensor detection result, target information regarding a moving target and a stationary target around the vehicle; (b) acquire, based on the map data and position-orientation of the vehicle, lane geometry information indicating a lane geometry around the vehicle; (c) adjust the lane geometry to generate an adjusted lane geometry satisfying a condition that the moving target is located within a lane and the stationary target is located outside of any lane; and (d) generate target-lane relationship information indicating a positional relationship between the moving target and the adjusted lane geometry. What is claimed is:
| 1. A target-lane relationship recognition apparatus mounted on a vehicle, comprising:
a sensor configured to detect a situation around the vehicle;
a memory device in which a map data indicating a boundary position of a lane on a map is stored; and
a processor configured to perform:
target information acquisition processing that acquires, based on a result of detection by the sensor, target information regarding a moving target and a stationary target around the vehicle;
lane geometry acquisition processing that acquires, based on the map data and a position and an orientation of the vehicle, lane geometry information indicating a lane geometry around the vehicle;
lane geometry adjustment processing that adjusts the lane geometry to generate an adjusted lane geometry satisfying a condition that the moving target is located within a lane and the stationary target is located outside of any lane; and
information generation processing that generates target-lane relationship information indicating a positional relationship between the moving target and the adjusted lane geometry, wherein
the lane geometry is represented by a group of plural elements, and
the processor performs the lane geometry adjustment processing such that the adjusted lane geometry maintains a relative positional relationship between the plural elements.
| 2. The target-lane relationship recognition apparatus according to claim 1, wherein
the target information includes a position of a representative point of the moving target, and
the condition includes the representative point being located within the lane within which the moving target is located.
| 3. The target-lane relationship recognition apparatus according to claim 1, wherein
the target information includes respective positions of a plurality of detected points defining a size of the moving target, and
the condition includes all of the plurality of detected points being located within the lane within which the moving target is located.
| 4. The target-lane relationship recognition apparatus according to claim 1, wherein
the target information includes a trajectory of the moving target, and
the condition includes the trajectory being located within the lane within which the moving target is located.
| 5. The target-lane relationship recognition apparatus according to claim 1, wherein
the target information includes a position and a velocity of the moving target, and wherein
in the lane geometry adjustment processing, the processor predicts a future position of the moving target based on the position and the velocity of the moving target, and the condition further includes that the future position of the moving target remains in the same lane as a current position of the moving target.
| 6. The target-lane relationship recognition apparatus according to claim 1, wherein
the target information includes a position and a velocity of the moving target, and wherein
in the lane geometry adjustment processing, the processor calculates a tangent line of a lane boundary closest to the moving target, and adds an angle between the tangent line and a vector of the velocity of the moving target being equal to or less than a threshold value to the condition.
| 7. The target-lane relationship recognition apparatus according to claim 1, wherein
in the target information acquisition processing, the processor further acquires target information regarding a lane changing target that is a second moving target in a middle of lane changing, and wherein
in the lane geometry adjustment processing, the processor adds the lane changing target overlapping a lane boundary to the condition.
| 8. The target-lane relationship recognition apparatus according to claim 1, further comprising a communication device configured to acquire, through a vehicle-to-vehicle communication or a vehicle-to-infrastructure communication, another vehicle's lane information indicating a travel lane of another vehicle around the vehicle, wherein
in the lane geometry adjustment processing, the condition further includes that the moving target is located within the travel lane of the another vehicle.
| 9. The target-lane relationship recognition apparatus according to claim 1, wherein
in the lane geometry adjustment processing, the processor recognizes a white line position based on the result of detection by the sensor, and adds a distance between the white line position and a lane boundary being equal to or less than a threshold value to the condition.
| 10. The target-lane relationship recognition apparatus according to claim 1, wherein
the processor further performs a driving assist control or an autonomous driving control by using the target-lane relationship information. | The apparatus has a sensor detecting a situation around a vehicle (1). A memory device stores map data indicating a boundary position of a lane (L1). A processing device performs lane geometry adjustment processing that adjusts a lane geometry (LG) to generate an adjusted lane geometry satisfying a condition that a moving target (TM1) is located within the lane and a stationary target (TS1) is located outside of the lane, and information generation processing that generates target-lane relationship information indicating a positional relationship between the moving target and the lane geometry. Apparatus for recognizing a positional relationship between a target and a lane around a vehicle. The apparatus utilizes the adjusted lane geometry consistent with an actual condition so as to accurately recognize the positional relationship between the surrounding target and the surrounding lane, thus accurately recognizing the preceding vehicle. The drawing shows a schematic view of an apparatus for recognizing a positional relationship between a target and a lane around a vehicle. L1, L2 Lanes; LG Lane geometry; TM1, TM2 Moving targets; TS1, TS2 Stationary targets; 1 Vehicle
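The adjustment described in these claims keeps the relative positions of the lane-geometry elements and shifts the whole geometry until moving targets fall inside a lane and stationary targets fall outside every lane. A rough Python sketch of that idea follows; the representation (lateral lane boundaries, a single lateral offset, the 0.1 m search step) is an assumption for illustration, not the patent's actual algorithm.

# Hypothetical sketch: translate the whole lane geometry by one lateral offset
# (preserving relative positions) until the placement condition is satisfied.
from typing import List, Tuple

Lane = Tuple[float, float]   # (left boundary, right boundary) lateral positions

def satisfies_condition(lanes, moving, stationary):
    def in_any(y):
        return any(lo <= y <= hi for lo, hi in lanes)
    return all(in_any(y) for y in moving) and not any(in_any(y) for y in stationary)

def adjust_lane_geometry(lanes: List[Lane], moving: List[float],
                         stationary: List[float], step=0.1, max_shift=2.0):
    shift = 0.0
    while abs(shift) <= max_shift:
        shifted = [(lo + shift, hi + shift) for lo, hi in lanes]
        if satisfies_condition(shifted, moving, stationary):
            return shifted
        shift = -shift if shift > 0 else -shift + step   # try 0, +0.1, -0.1, +0.2, ...
    return lanes   # fall back to the unadjusted geometry

lanes = [(0.0, 3.5), (3.5, 7.0)]
print(adjust_lane_geometry(lanes, moving=[7.05], stationary=[7.8]))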
Please summarize the input | AUTOMATED VALET PARKING SYSTEM, CONTROL METHOD OF AUTOMATED VALET PARKING SYSTEM, AND AUTONOMOUS DRIVING VEHICLEAn automatic parking system for performing automatic parking of an autonomously driving vehicle by giving an instruction regarding a target vehicle speed and a target route to the autonomously driving vehicle in a parking lot, comprising: an autonomously driving vehicle and a communication capable vehicle that is not capable of autonomous driving but is capable of vehicle-to-vehicle communication; A vehicle information acquisition unit that acquires location information in the parking lot of the target vehicle, and a condition determination unit that determines whether the target vehicle satisfies the vehicle-to-vehicle communication driving condition based on the location information in the parking lot of the target vehicle acquired by the vehicle information acquisition unit and a vehicle-to-vehicle communication driving instructing unit for causing the target vehicle determined by the condition determining unit to satisfy the vehicle-to-vehicle communication driving condition to perform vehicle-to-vehicle communication traveling by vehicle-to-vehicle communication, wherein in vehicle-to-vehicle communication driving, automatic parking Prioritize vehicle speed adjustment of vehicle access suppression by vehicle-to-vehicle communication over the target vehicle speed for self-driving vehicles.|1. An automatic parking system that performs automatic parking of the autonomous vehicle by giving instructions regarding target vehicle speed and target route to the autonomous vehicle in a parking lot, wherein the vehicle information acquisition unit acquires location information of the autonomous vehicle in the parking lot. and a condition determination unit that determines whether the autonomous vehicle satisfies preset vehicle-to-vehicle communication driving conditions, based on the location information of the autonomous vehicle in the parking lot acquired by the vehicle information acquisition unit, and the condition determination unit. and an inter-vehicle communication driving instruction unit that causes the autonomous vehicle, which is determined to have satisfied the inter-vehicle communication driving conditions, to perform inter-vehicle communication driving by inter-vehicle communication, wherein in the inter-vehicle communication driving, the automatic parking The vehicle speed adjustment for vehicle approach suppression by vehicle-to-vehicle communication is given priority over the target vehicle speed for the autonomous driving vehicle, and the vehicle information acquisition unit, Acquires location information within the parking lot of a target vehicle including the autonomous driving vehicle and a communication capable vehicle that is incapable of autonomous driving but is capable of vehicle-to-vehicle communication, and the condition determination unit determines the location of the target vehicle within the parking lot acquired by the vehicle information acquisition unit. Based on the information, it is determined whether the target vehicle satisfies a preset vehicle-to-vehicle communication driving condition, and the vehicle-to-vehicle communication driving instruction unit determines that the vehicle-to-vehicle communication driving condition is determined by the condition determination unit. 
The target vehicle is caused to perform the vehicle-to-vehicle communication drive by vehicle-to-vehicle communication, and the condition determination unit determines that, when the number of the target vehicles in the same lane in the parking lot is equal to or greater than the lane number threshold, the condition determination unit determines that the number of target vehicles in the same lane is greater than or equal to the lane number threshold. An automatic parking system that determines that the plurality of target vehicles running satisfies the vehicle-to-vehicle communication driving conditions.
| 2. delete
| 3. delete
| 4. The method of claim 1, wherein the condition determination unit is configured to select, when there are a plurality of target vehicles whose inter-vehicle distances are less than the inter-vehicle distance threshold, the plurality of target vehicles whose inter-vehicle distances are less than the inter-vehicle distance threshold. An automatic parking system that determines that vehicle-to-vehicle communication driving conditions have been met.
| 5. The method according to claim 1 or 4, wherein the vehicle information acquisition unit acquires location information within the parking lot of a general vehicle that is unable to drive autonomously and is unable to communicate between vehicles, and the condition determination unit determines that the general vehicle is in the parking lot. An automatic parking system where, when there are a plurality of target vehicles located on a driving lane, it is determined that the plurality of target vehicles on the lane satisfy the vehicle-to-vehicle communication driving condition.
| 6. The method according to claim 1 or 4, wherein the vehicle information acquisition unit acquires location information within the parking lot of a general vehicle incapable of autonomous driving and vehicle-to-vehicle communication, and the condition determination unit determines the location of the vehicle within a preset area in the parking lot. An automatic parking system that determines that, when the general vehicle enters, the plurality of target vehicles in the setting area satisfy the vehicle-to-vehicle communication driving conditions.
| 7. The method according to claim 1 or 4, wherein, when the target vehicle that performs vehicle-to-vehicle communication travel exists, the target vehicle is based on position information in the parking lot of a plurality of target vehicles acquired by the vehicle information acquisition unit. It further includes a release condition determination unit that determines whether the preset release condition is satisfied, and the vehicle-to-vehicle communication driving instruction unit determines that the vehicle-to-vehicle connection of the target vehicle has been determined to have satisfied the release condition by the release condition decision unit. An automatic parking system that terminates communication driving.
| 8. A control method of an automatic parking system that executes automatic parking of an autonomous vehicle by giving instructions regarding target vehicle speed and target route to the autonomous vehicle in a parking lot, wherein location information of the autonomous vehicle in the parking lot is acquired. an information acquisition step, and a condition determination step for determining whether the autonomous vehicle satisfies preset vehicle-to-vehicle communication driving conditions based on location information of the autonomous vehicle within the parking lot acquired in the vehicle information acquisition step; and an inter-vehicle communication driving instruction step that causes the autonomous vehicle, which is determined to have satisfied the inter-vehicle communication driving conditions in the condition determination step, to perform inter-vehicle communication driving by inter-vehicle communication, and in the inter-vehicle communication driving,, the vehicle speed adjustment for vehicle approach suppression by the vehicle-to-vehicle communication is given priority to the autonomous driving vehicle over the target vehicle speed for the automatic parking, and in the vehicle information acquisition step, Position information in the parking lot of target vehicles including the autonomous driving vehicle and a communication capable vehicle that is incapable of autonomous driving but is capable of vehicle-to-vehicle communication is acquired, and in the condition determination step, the parking lot of the target vehicle acquired in the vehicle information acquisition step is acquired. Based on the position information in the vehicle, it is determined whether the target vehicle satisfies a preset inter-vehicle communication driving condition, and in the inter-vehicle communication driving instruction step, it is determined that the inter-vehicle communication driving condition is satisfied in the condition determination step. The target vehicle is made to perform the vehicle-to-vehicle communication drive by vehicle-to-vehicle communication, and in the condition determination step, when the number of target vehicles in the same lane in the parking lot is equal to or greater than the lane number threshold, A control method of an automatic parking system, wherein it is determined that the plurality of target vehicles traveling in the same lane satisfy the vehicle-to-vehicle communication driving conditions.
| 9. An autonomous driving vehicle that performs automatic parking in a parking lot based on instructions regarding a target vehicle speed and a target route from an automatic parking system in the parking lot, comprising: a own vehicle location recognition unit that recognizes a location in the parking lot; and the autonomous driving vehicle. A driving state recognition unit that recognizes the driving state of the autonomous vehicle based on an internal sensor of the vehicle, the driving state of the autonomous vehicle recognized by the driving state recognition unit, and the target vehicle speed indicated by the automatic parking system, or Based on the comparison result of the target route, a vehicle-side condition determination unit for determining whether the autonomous vehicle satisfies a preset vehicle-side vehicle-to-vehicle communication driving condition, and the vehicle-side condition determination unit determines whether the vehicle-side vehicle-to-vehicle communication driving condition is and an inter-vehicle communication driving executing unit that performs inter-vehicle communication driving of the autonomous driving vehicle by inter-vehicle communication when it is determined that the inter-vehicle communication driving conditions are met, and in the inter-vehicle communication driving, The vehicle speed adjustment for vehicle approach suppression by the vehicle-to-vehicle communication is given priority over the target vehicle speed for automatic parking, and the vehicle-side condition determination unit compares the target vehicle speed indicated by the automatic parking system according to the location in the parking lot. Then, when the vehicle speed of the autonomous vehicle at the location is greater than the vehicle speed determination threshold, it is determined that the vehicle-side vehicle-to-vehicle communication driving condition is satisfied.
| 10. delete | The system (1) has a vehicle information acquisition unit for acquiring positional information of an autonomous driving vehicle (23) in a parking place. A condition determination unit determines whether the autonomous driving vehicle satisfies a preset inter-vehicle communication traveling condition based on the positional information of the autonomous driving vehicle in the parking place acquired by the vehicle information acquisition unit. An inter-vehicle communication traveling instruction unit causes the autonomous driving vehicle determined to satisfy the inter-vehicle communication traveling condition by the condition determination unit to perform inter-vehicle communication traveling by inter-vehicle communication, where the autonomous driving vehicle preferentially performs vehicle speed adjustment for vehicle approach suppression by the inter-vehicle communication over target vehicle speed for an automated valet parking in the inter-vehicle communication traveling. INDEPENDENT CLAIMS are included for: (a) Method for controlling an automated valet parking system; (b) Autonomous driving vehicle. System for executing automated valet parking of an autonomous driving vehicle (claimed) in a parking place by issuing an instruction related to target vehicle speed and a target route to the autonomous driving vehicle. The system prevents the inter-vehicle distance between the autonomous driving vehicle and the target vehicle from becoming too short in the parking place. The system comprises a vehicle-side condition determination unit that determines whether the autonomous vehicle satisfies the inter-vehicle communication traveling condition based on a comparison result between the traveling state of the vehicle and the target vehicle speed or the target route as instructed from the automated valet parking system, so that vehicle speed adjustment for vehicle approach suppression by the inter-vehicle communication is preferentially performed over the target vehicle speed for the automated valet parking. The drawing shows a schematic view of an automated valet parking system. 1 System for executing automated valet parking of autonomous driving vehicle; 3 Parking place sensor; 4 Parking place map database; 23 Autonomous driving vehicle; N Network
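Claim 1 of this record gates inter-vehicle-communication driving on how many target vehicles share a lane, and claim 4 adds an inter-vehicle-distance condition. The sketch below is only a guess at how such a check could be expressed; the thresholds, field names, and data layout are invented.

# Illustrative only: pick target vehicles (autonomous or V2V-capable) that
# should start inter-vehicle-communication driving, per lane-count and
# inter-vehicle-distance conditions.
LANE_COUNT_THRESHOLD = 3
GAP_THRESHOLD_M = 4.0

def vehicles_for_v2v_driving(targets):
    # targets: list of dicts with "id", "lane", and "position_m" along the lane.
    selected = set()
    lanes = {}
    for t in targets:
        lanes.setdefault(t["lane"], []).append(t)
    for lane_targets in lanes.values():
        if len(lane_targets) >= LANE_COUNT_THRESHOLD:          # lane-count condition
            selected.update(t["id"] for t in lane_targets)
        ordered = sorted(lane_targets, key=lambda t: t["position_m"])
        for a, b in zip(ordered, ordered[1:]):                 # distance condition
            if b["position_m"] - a["position_m"] < GAP_THRESHOLD_M:
                selected.update((a["id"], b["id"]))
    return selected

targets = [{"id": "A", "lane": 1, "position_m": 0.0},
           {"id": "B", "lane": 1, "position_m": 3.0},
           {"id": "C", "lane": 2, "position_m": 10.0}]
print(vehicles_for_v2v_driving(targets))   # {'A', 'B'} (set order may vary)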
Please summarize the input | Enhanced vehicle-to-everything (V2X) communications using a satellite/airborne interface. A system, method and apparatus for mobile communications including sidelink transmissions is provided. A user equipment (UE) maintains a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node. The UE switches between interfaces according to a current connectivity state and based on signal attributes associated with individual interfaces. The invention claimed is:
| 1. A method of wireless communications, comprising: maintaining, by a first user equipment (UE), a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node; switching, by the first UE, between a first connectivity state and a second connectivity state based on at least one of:
a first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface;
a second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and
a third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface;
wherein:
the first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with at least one of the second interface and the third interface when the first UE is determined to be in the second connectivity state.
| 2. The method of claim 1, wherein:
the first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state.
| 3. The method of claim 1, wherein a signal strength attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal strength indicator (RSSI) or a reference signal received power (RSRP) of a reference signal received via a respective signal interface.
| 4. The method of claim 1, wherein a signal quality attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal received quality (RSRQ) of a reference signal received via a respective interface.
| 5. The method of claim 1 further comprising receiving configuration parameters of one or more reference signals for determining at least one of the first signal attribute, the second signal attribute or third signal attribute.
| 6. The method of claim 1, wherein the radio access network (RAN) infrastructure node is a base station.
| 7. The method of claim 1, wherein the first interface is a Uu interface.
| 8. The method of claim 7, wherein the Uu interface is based on a first plurality of protocol terminations between the first user equipment (UE) and the radio access network (RAN) infrastructure node.
| 9. The method of claim 1, wherein the second interface is a PC-5 interface.
| 10. The method of claim 9, wherein the PC-5 interface is based on a second plurality of protocol terminations between the first user equipment (UE) and the second UE.
| 11. The method of claim 1, wherein the third interface is a satellite/airborne interface.
| 12. The method of claim 11, wherein the satellite/airborne interface is based on a third plurality of protocol terminations between the first user equipment (UE) and the satellite node or the airborne node.
| 13. The method of claim 1, wherein the switching between the first connectivity state to the second connectivity state is based on:
comparing the first signal attribute with a first threshold;
comparing the second signal attribute with a second threshold; and
comparing the third signal attribute with a third threshold.
| 14. The method of claim 13, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.
| 15. The method of claim 13, wherein the first threshold, the second threshold and the third threshold are pre-determined.
| 16. The method of claim 13 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first threshold, the second threshold and the third threshold.
| 17. The method of claim 16, wherein the configuration parameters are received from the radio access network (RAN) infrastructure node and via the first interface.
| 18. The method of claim 17, wherein receiving the configuration parameters includes receiving one or more radio resource control (RRC) messages.
| 19. The method of claim 1, wherein the switching between the first connectivity state and the second connectivity state is further based on quality of service (QoS) requirements of data packets transmitted or received by the first UE.
| 20. The method of claim 19, wherein:
the first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state;
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and
the first UE switches from the first connectivity state to the second connectivity state based on a determination that a latency associated with communications associated with the third interface match a latency requirement of the data packets.
| 21. The method of claim 19 further comprising, determining by the first user equipment (UE), the quality of service (QoS) requirements of the data packets based on priority tags of the data packets.
| 22. The method of claim 21, wherein the priority tags of the data packets indicate that the data packets are associated with one or more of:
safety related information;
firefighters or government agencies;
autonomous driving; and
traffic information.
| 23. The method of claim 1 further comprising synchronizing the connectivity state of the first user equipment (UE) with a Global Navigation Satellite System (GNSS).
| 24. The method of claim 1, wherein:
the maintaining the first interface, the second interface and the third interface comprises registering by the first user equipment (UE) with a Core Network node, wherein a context of the first UE associated with the first interface, the second interface and the third interface is established in the Core Network.
| 25. The method of claim 1, wherein the switching from the first connectivity state to the second connectivity state is further based on one or more of:
a first duration for which the first signal attribute maintains at least one of a first signal strength or signal quality requirement;
a second duration for which the second signal attributes maintains at least one of a second signal strength or signal quality requirement; and
a third duration for which the third signal attribute maintains at least one of a third signal strength or signal quality requirement.
| 26. The method of claim 25, wherein
the first user equipment (UE) communications in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state;
wherein the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and
the first UE switches from the first connectivity state to the second connectivity state when the first duration is smaller than a first duration threshold or the second duration is smaller than a second duration threshold, and, the third duration is larger than a third duration threshold.
| 27. The method of claim 26 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first duration threshold, the second duration threshold and the third duration threshold.
| 28. The method of claim 26, wherein the first duration threshold, the second duration threshold and the third duration threshold are pre-determined values.
| 29. A method of wireless communications, comprising:
maintaining, by a first user equipment (UE), a plurality of interfaces including between the first UE and a set of network components associated with wireless communication, wherein the plurality of interfaces facilitate communication of data packets associated with Vehicle to Anything (V2X) services and wherein the plurality of interfaces include a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node;
managing at least one of receipt or transmission of V2X data packets based on a determined connectivity state of the of the first UE, wherein managing the at least one of receipt or transmission of V2X data packets is based on switching, by the first UE, between a first connectivity state and a second connectivity state based on signal attributes associated with at least one of the first, second or third interfaces;
wherein:
the first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with at least one of the second interface and the third interface when the first UE is determined to be in the second connectivity state.
| 30. The method of claim 29, wherein:
the first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state.
| 31. The method of claim 29, wherein a signal strength attribute associated with at least one of a first signal attribute, a second signal attribute and a third signal attribute is based on a reference signal strength indicator (RSSI) or a reference signal received power (RSRP) of a reference signal received via a respective signal interface.
| 32. The method of claim 29, wherein a signal quality attribute associated with at least one of a first signal attribute, a second signal attribute and a third signal attribute is based on a reference signal received quality (RSRQ) of a reference signal received via a respective interface.
| 33. The method of claim 32 further comprising receiving configuration parameters of one or more reference signals for determining at least one of the first signal attribute, the second signal attribute or third signal attribute.
| 34. The method of claim 29, wherein the switching between the first connectivity state to the second connectivity state is based on:
comparing a first signal attribute with a first threshold;
comparing a second signal attribute with a second threshold; and
comparing a third signal attribute with a third threshold.
| 35. The method of claim 34, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.
| 36. The method of claim 29, wherein the switching between the first connectivity state and the second connectivity state is further based on quality of service (QoS) requirements of data packets transmitted or received by the first UE.
| 37. The method of claim 29, wherein:
the first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state;
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and
the first UE switches from the first connectivity state to the second connectivity state based on a determination that a latency associated with communications associated with the third interface match a latency requirement of the data packets.
| 38. The method of claim 29, wherein the switching from the first connectivity state to the second connectivity state is further based on one or more of:
a first duration for which a first signal attribute maintains at least one of a first signal strength or signal quality requirement;
a second duration for which a second signal attributes maintains at least one of a second signal strength or signal quality requirement; and
a third duration for which a third signal attribute maintains at least one of a third signal strength or signal quality requirement.
| 39. The method of claim 38, wherein
the first user equipment (UE) communications in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state;
wherein the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and
the first UE switches from the first connectivity state to the second connectivity state when the first duration is smaller than a first duration threshold or the second duration is smaller than a second duration threshold, and, the third duration is larger than a third duration threshold. | The method involves maintaining a set of interfaces including an interface between a user equipment (UE) and a radio access network (RAN) infrastructure node by the UE. Another interface is provided between the UE and another UE, and a third interface is connected between the former UE and a satellite node or an airborne node. The UE is switched between two connectivity states based on a signal attribute associated with the former interface, where the UE communicates in accordance with one of the interfaces when the UE is determined to be in the connectivity states, and the latter interface is a PC-5 interface. Method for performing wireless communication between a UE and nodes in a RAN such as an evolved universal terrestrial RAN (EUTRAN) and a universal terrestrial RAN (UTRAN), using interfaces, e.g. a Uu interface with a RAN node, a PC-5 interface with a pedestrian, a PC-5 interface with a vehicle and a PC-5 interface with an infrastructure node. Uses include but are not limited to smartphones, tablets, laptops, computers, wireless transmission and/or reception units in a vehicle, V2X or Vehicle to Vehicle (V2V) devices, wireless sensors, and internet-of-things (IoT) devices. The method enables the computing devices to utilize the wireless communication network to facilitate interactions with other devices that can access the network or to facilitate interaction, through the network, with devices utilizing other communication networks in an efficient manner. The drawing shows a block diagram of connectivity states and state transitions.
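The claims of this record describe switching between a terrestrial connectivity state (Uu and PC-5) and a satellite/airborne state by comparing per-interface signal attributes (e.g. RSRP or RSRQ) against thresholds. The Python sketch below shows one plausible shape of that decision; the state names, threshold values, and field names are assumptions and are not taken from the claims or from any 3GPP specification.

# Hedged sketch: threshold-based switching between a terrestrial state
# (Uu/PC-5) and a satellite/airborne state.
from dataclasses import dataclass

@dataclass
class Thresholds:
    uu_rsrp_dbm: float = -110.0
    pc5_rsrp_dbm: float = -105.0
    sat_rsrp_dbm: float = -120.0

def next_state(current, uu_rsrp, pc5_rsrp, sat_rsrp, th=Thresholds()):
    terrestrial_ok = uu_rsrp >= th.uu_rsrp_dbm or pc5_rsrp >= th.pc5_rsrp_dbm
    satellite_ok = sat_rsrp >= th.sat_rsrp_dbm
    if current == "STATE_1" and not terrestrial_ok and satellite_ok:
        return "STATE_2"    # fall back to the satellite/airborne interface
    if current == "STATE_2" and terrestrial_ok:
        return "STATE_1"    # return to Uu/PC-5 once a terrestrial link recovers
    return current

print(next_state("STATE_1", uu_rsrp=-118.0, pc5_rsrp=-112.0, sat_rsrp=-115.0))  # STATE_2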
Please summarize the input | Information processing apparatus, information processing method, and recording medium. An information processing apparatus includes: a memory configured to store information about mobile bodies, each of which forms a vehicle by being coupled with a main body unit and is capable of autonomous driving; and a processor configured to transmit a summon command to a first mobile body existing within a predetermined range, the summon command summoning the first mobile body to a predetermined assembly location, and cause a predetermined number of first mobile bodies assembled at the predetermined assembly location to carry one first main body unit that is larger or heavier than a main body unit that can be carried by one first mobile body, and to cause the predetermined number of first mobile bodies to move to a predetermined destination. What is claimed is:
| 1. An information processing apparatus comprising:
a memory configured to store information about mobile bodies, each of which forms a vehicle by being coupled with a main body unit and is capable of autonomous driving; and
a processor configured to:
select, from a plurality of first mobile bodies existing within a predetermined range, a needed number of the first mobile bodies for transport of one first main body unit that is larger or heavier than a main body unit that can be carried by only one of the first mobile bodies;
transmit, to user terminals of users who own the selected first mobile bodies, use permission requests for the selected first mobile bodies;
determine whether or not a number of acceptance responses, which are from the user terminals and are for using the selected first mobile bodies, reaches the needed number;
in a case where the number of acceptance responses does not reach the needed number, (i) select additionally, from the plurality of first mobile bodies existing within the predetermined range, a number of next first mobile bodies corresponding to a shortage of the first mobile bodies and (ii) transmit a use permission request to a user terminal of a user who owns one of the next first mobile bodies additionally selected;
in a case where the number of acceptance responses reaches the needed number, transmit a summon command to the first mobile bodies for which the acceptance responses from the user terminals were transmitted, those first mobile bodies being summoned first mobile bodies, and the summon command summoning the summoned first mobile bodies to a predetermined assembly location;
cause the summoned first mobile bodies assembled at the predetermined assembly location to carry the one first main body unit; and
cause the summoned first mobile bodies to move to a predetermined destination, wherein
each of the plurality of first mobile bodies is owned by a predetermined user as a private-use vehicle.
| 2. The information processing apparatus according to claim 1, wherein, when the summoned first mobile bodies carrying the one first main body unit are no longer being used, the processor transmits a dismiss command to the summoned first mobile bodies, the dismiss command instructing movement of the summoned first mobile bodies to respective return locations.
| 3. The information processing apparatus according to claim 1, wherein
the information about the mobile bodies includes information indicating a state of at least one of the mobile bodies, and
the first mobile bodies to which the processor transmits the summon command are each in an idle state.
| 4. The information processing apparatus according to claim 1, wherein the summoned first mobile bodies caused by the processor to assemble at the predetermined assembly location are each in an idle state at a time of reception of the summon command.
| 5. The information processing apparatus according to claim 1, wherein the processor causes the summoned first mobile bodies to move and carry the one first main body unit by convoy-traveling.
| 6. The information processing apparatus according to claim 5, wherein the processor controls each of the summoned first mobile bodies in relation to the convoy-traveling.
| 7. The information processing apparatus according to claim 5, wherein
the processor is configured to select from the summoned first mobile bodies, according to a predetermined condition, a mobile body to be a leader of the convoy-traveling at a time of moving and carrying the one first main body unit, and
the summoned first mobile bodies perform vehicle-to-vehicle communication with one another and perform the convoy-traveling according to an instruction from the mobile body selected as the leader.
| 8. The information processing apparatus according to claim 5, wherein the processor selects a first mobile body that is to be a leader from among the summoned first mobile bodies, and the processor causes the summoned first mobile bodies to perform the convoy-traveling.
| 9. The information processing apparatus according to claim 1, wherein the processor determines the needed number of the first mobile bodies based on the information about the mobile bodies.
| 10. An information processing method comprising:
storing, in a memory, information about mobile bodies, each of which forms a vehicle by being coupled with a main body unit and is capable of autonomous driving;
selecting, from a plurality of first mobile bodies existing within a predetermined range, a needed number of the first mobile bodies for transport of one first main body unit that is larger or heavier than a main body unit that can be carried by only one of the first mobile bodies;
transmitting, to user terminals of users who own the selected first mobile bodies, use permission requests for the selected first mobile bodies;
determining whether or not a number of acceptance responses, which are from the user terminals and are for using the selected first mobile bodies, reaches the needed number;
in a case where the number of acceptance responses does not reach the needed number, (i) selecting additionally, from the plurality of first mobile bodies existing within the predetermined range, a number of next first mobile bodies corresponding to a shortage of the first mobile bodies and (ii) transmitting a use permission request to a user terminal of a user who owns one of the next first mobile bodies additionally selected;
in a case where the number of acceptance responses reaches the needed number, transmitting a summon command to the first mobile bodies for which the acceptance response from the user terminals were transmitted, those first mobile bodies being summoned first mobile bodies, and the summon command summoning the summoned first mobile bodies to a predetermined assembly location;
causing the summoned first mobile bodies assembled at the predetermined assembly location to carry the one first main body unit; and
causing the summoned first mobile bodies to move to a predetermined destination, wherein
each of the plurality of first mobile bodies is owned by a predetermined user as a private-use vehicle.
| 11. The information processing method according to claim 10, comprising transmitting, when the summoned first mobile bodies carrying the one first main body unit are no longer being used, a dismiss command to the summoned first mobile bodies, the dismiss command instructing movement of the summoned first mobile bodies to respective return locations.
| 12. The information processing method according to claim 10, wherein the information about the mobile bodies includes information indicating a state of the at least one of the mobile bodies, and the first mobile bodies to which the summon command is transmitted are each in an idle state.
| 13. The information processing method according to claim 10, wherein the summoned first mobile bodies move and carry the one first main body unit by convoy-traveling.
| 14. The information processing method according to claim 13, wherein each of the summoned first mobile bodies is controlled in relation to the convoy-traveling.
| 15. The information processing method according to claim 13, wherein
a mobile body, which is to be a leader of the convoy-traveling at a time of moving and carrying the one first main body unit, is selected from the summoned first mobile bodies according to a predetermined condition, and
the summoned first mobile bodies perform vehicle-to-vehicle communication with one another and perform the convoy-traveling according to an instruction from the mobile body selected as the leader.
| 16. The information processing method according to claim 13, wherein the summoned first mobile bodies perform vehicle-to-vehicle communication with one another, select from the summoned first mobile bodies a first mobile body that is to be a leader, and perform the convoy-traveling.
| 17. The information processing method according to claim 10, wherein the needed number of the first mobile bodies is determined based on the information about the mobile bodies.
| 18. A non-transitory computer-readable medium storing an information processing program for causing a user terminal to:
receive a use permission request for a mobile body that is owned by an owner of the user terminal as a private-use vehicle, the mobile body forming a vehicle by being coupled with a main body unit and being capable of autonomous driving;
transmit a response to the use permission request; and
receive a summon command summoning the mobile body to a predetermined assembly location in a predetermined case, wherein
the use permission request requests permission to use the mobile body as a first mobile body, in a case where one first main body unit that is larger or heavier than a main body unit that can be carried by only one first mobile body is to be carried and moved by a needed number of first mobile bodies existing within a predetermined range, and
the predetermined case is
a case where the response is an acceptance response for using the mobile body as the first mobile body, and
a case where a number of acceptance responses are determined to reach the needed number for transport of the one first main body unit. | The apparatus has a memory which is configured to store information about mobile bodies that are configured to form a vehicle by being coupled with a main frame unit (300) and are capable of autonomous driving. A processor is configured to transmit a summon command to a first mobile frame existing within a predetermined range. The summon command is configured to summon the first mobile frame to a predetermined assembly location. A predetermined number of first mobile frames assembled at the predetermined assembly location is caused to carry one first main frame unit that is larger or heavier than a main frame unit that can be carried by one first mobile frame, and the predetermined number of the first mobile frames is caused to move to a predetermined destination. INDEPENDENT CLAIMS are included for the following: (a) an information processing method; (b) a non-transitory computer-readable medium storing a program for processing information. Information processing apparatus. The information processing apparatus does not need to perform control regarding the convoy-traveling for each of the predetermined number of first mobile bodies, and thus the processing load on the information processing apparatus is reduced. The traveling unit is not used without permission from the owner user, and the owner user is prevented from being interrupted in using the traveling unit. The number of traveling units needed to transport one main frame unit can be reduced. The drawing shows a schematic view illustrating the system configuration of the traveling-unit summoning system. 1 Center server; 4A, 4B User terminals; 100 Traveling-unit summoning system; 200A, 200B Traveling units; 300 Main frame unit
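Claims 1 and 10 of this record describe selecting a needed number of privately owned traveling units, requesting each owner's permission, and topping up the selection until enough acceptances arrive before summoning. The sketch below is a guess at that flow in Python; the function names and the callback interface are invented for the example.

# Illustrative only: select units, request owner permission, and top up the
# selection until the needed number of acceptances is reached.
def summon_traveling_units(candidates, needed, request_permission):
    # candidates: unit ids in priority order.
    # request_permission: callable(unit_id) -> True if the owner accepts.
    accepted, pool = [], list(candidates)
    while len(accepted) < needed and pool:
        shortage = needed - len(accepted)
        batch, pool = pool[:shortage], pool[shortage:]
        accepted.extend(u for u in batch if request_permission(u))
    if len(accepted) < needed:
        return None               # not enough owners accepted
    return accepted               # these units receive the summon command

declined = {"u2", "u4"}           # owners of u2 and u4 decline in this example
result = summon_traveling_units(["u1", "u2", "u3", "u4", "u5"], needed=3,
                                request_permission=lambda u: u not in declined)
print(result)                     # ['u1', 'u3', 'u5']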
Please summarize the input | Road sign recognition for connected vehicles. The disclosure includes embodiments for providing road sign recognition for connected vehicles. In some embodiments, a method includes determining that a first set of first vehicles have assessed that a content for a road sign is a first value. In some embodiments, the method includes determining that the content for the road sign is a second value assessed by a second set of second vehicles based, at least in part, on the first vehicles having a same make and model. What is claimed is:
| 1. A method that is executed by an onboard vehicle computer system of an ego vehicle, the method comprising:
determining first content data from a first set of vehicles that describes content for a road sign;
determining second content data from a second set of vehicles that describes the content for the road sign; and
accepting the second content data as describing the content of the road sign based, at least in part, on a number of vehicles in the first set of vehicles having a same make and model exceeding a threshold value.
| 2. The method of claim 1, further comprising:
assigning a first weight to the first content data based on the number of vehicles in the first set of vehicles exceeding the threshold value; and
assigning a second weight to the second content data based on a number of vehicles in the second set of vehicles failing to have the same make and model;
wherein accepting the second content data as describing the content of the road sign is further based on the second weight being greater than the first weight.
| 3. The method of claim 2, further comprising:
determining a presence of a cluster of vehicles in the first set of vehicles having the same make and model responsive to the number of vehicles in the first set of vehicles exceeding the threshold value; and
lowering the first weight based on determining the presence of the cluster.
| 4. The method of claim 1, wherein the ego vehicle is included in the first set of vehicles.
| 5. The method of claim 1, wherein the ego vehicle included in the second set of vehicles.
| 6. The method of claim 1, wherein the ego vehicle receives a Vehicle-to-Anything (V2X) message transmitted by a remote vehicle that includes sensor data that includes an image of the road sign.
| 7. The method of claim 6, further comprising determining that the content for the road sign is the second content data based at least in part on the image of the road sign.
| 8. The method of claim 6, wherein determining the first content data includes determining a type of road sign or a rule described by the road sign.
| 9. The method of claim 1, further comprising:
receiving sensor data from the first set of vehicles and the second set of vehicles, wherein the sensor data includes images of the road sign; and
comparing the sensor data to a sign database, wherein determining the first content data and the second content data is based on comparing the sensor data to the sign database.
| 10. An onboard vehicle computer system of an ego vehicle comprising:
a processor communicatively coupled to a non-transitory memory that stores computer code that is operable, when executed by the processor, to cause the processor to:
determine first content data from a first set of vehicles that describes content for a road sign;
determine second content data from a second set of vehicles that describes the content for the road sign; and
accept the second content data as describing the content of the road sign based, at least in part, on a number of vehicles in the first set of vehicles having a same make and model exceeding a threshold value.
| 11. The system of claim 10, wherein the non-transitory memory stores additional computer code that is operable, when executed by the processor, to cause the processor to:
assign a first weight to the first content data based on the number of vehicles in the first set of vehicles exceeding the threshold value; and
assign a second weight to the second content data based on a number of second vehicles in the second set of vehicles failing to have the same make and model;
wherein accepting the second content data as describing the content of the road sign is further based on the second weight being greater than the first weight.
| 12. The system of claim 11, wherein the non-transitory memory stores additional computer code that is operable, when executed by the processor, to cause the processor to:
determine a presence of a cluster of vehicles in the first set of vehicles having the same make and model responsive to the number of vehicles in the first set of vehicles exceeding the threshold value; and
lower the first weight based on determining the presence of the cluster.
| 13. The system of claim 10, wherein the ego vehicle is included in the first set of vehicles.
| 14. The system of claim 10, wherein the ego vehicle included in the second set of vehicles.
| 15. The system of claim 12, further comprising a Vehicle-to-Anything (V2X) radio communicatively coupled to the processor, wherein the V2X radio is operable to receive a V2X message transmitted by a remote vehicle that includes sensor data that includes an image of the road sign.
| 16. The system of claim 15, wherein determining the first content data includes determining a type of road sign or a rule described by the road sign.
| 17. A non-transitory computer program product that is an element of an onboard system of an ego vehicle comprising instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising:
determining first content data from a first set of vehicles that describes content for a road sign;
determining second content data from a second set of vehicles that describes the content for the road sign; and
accepting the second content data as describing the content of the road sign based, at least in part, on a number of vehicles in the first set of vehicles having a same make and model exceeding a threshold value.
| 18. The computer program product of claim 17, wherein the operations further comprise:
assigning a first weight to the first content data based on the number of vehicles in the first set of vehicles exceeding the threshold value; and
assigning a second weight to the second content data based on a number of second vehicles in the second set of vehicles failing to have the same make and model;
wherein accepting the second content data as describing the content of the road sign is further based on the second weight being greater than the first weight.
| 19. The computer program product of claim 17, wherein the ego vehicle includes an Advanced Driver Assistance System (ADAS system) and the ADAS system of the ego vehicle receives digital data describing the determination of the second content data of the road sign generated by the computer program product and uses the determination of the second content data of the road sign to control an ADAS function of the ego vehicle.
| 20. The computer program product of claim 17, wherein the ego vehicle is an autonomous vehicle and the onboard system of the autonomous vehicle receives digital data describing the determination of the second content data of the road sign generated by the computer program product and uses the determination of the second content data of the road sign to autonomously control an operation of the autonomous vehicle. | The method involves determining that a first set of first vehicles has assessed that a content for a traffic sign (160) is a first value, and determining that the content for the traffic sign is a second value assessed by a second set of second vehicles based on the first vehicles. The first set is numerically greater than the second set, such that a majority of a group including the first vehicles and the second vehicles has assessed that the content for the traffic sign is the first value. The method is performed by a vehicle-mounted computer system of an ego vehicle (123). INDEPENDENT CLAIMS are included for the following: a system for recognizing a traffic sign for networked vehicles; and a computer program product for recognizing a traffic sign for networked vehicles. Method for recognizing a traffic sign for networked vehicles. The accuracy of the image recognition results of the ego vehicle is improved. The image recognition process is enhanced by correcting for artifact or pixel hyperbole. The control decisions are improved over the control decisions made by the ADAS system without the benefit of the digital data provided by the traffic sign system, since the traffic sign is accurately described by digital data. The drawing shows a schematic view of the operating environment for a traffic sign system. (Drawing includes non-English language text) 100 Operating environment; 120 Bus; 122 Networked device; 123 Ego vehicle; 160 Traffic sign
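Claims 2 and 18 of this record weight the two sets' readings of the sign and down-weight a set in which too many vehicles share the same make and model (a possible shared sensing bias). The Python sketch below illustrates that weighting in a simplified form; the threshold, the 0.5 weight, and the vehicle tuples are invented, not values from the patent.

# Illustrative only: accept the reading of the set with the larger weighted
# vote, down-weighting a set dominated by one make/model.
from collections import Counter

SAME_MODEL_THRESHOLD = 3

def set_weight(vehicles):
    # vehicles: list of (make, model) tuples for the set reporting one value.
    most_common = Counter(vehicles).most_common(1)[0][1] if vehicles else 0
    return 0.5 if most_common >= SAME_MODEL_THRESHOLD else 1.0

def accepted_content(first_value, first_vehicles, second_value, second_vehicles):
    w1 = set_weight(first_vehicles) * len(first_vehicles)
    w2 = set_weight(second_vehicles) * len(second_vehicles)
    return first_value if w1 > w2 else second_value

first = [("MakeA", "Model1")] * 4     # a cluster of four identical vehicles
second = [("MakeA", "Model2"), ("MakeB", "Model3"), ("MakeC", "Model4")]
print(accepted_content("speed limit 80", first, "speed limit 60", second))  # speed limit 60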
Please summarize the input | ENHANCED VEHICLE-TO-EVERYTHING (V2X) COMMUNICATIONS USING A SATELLITE/AIRBORNE INTERFACE. A system, method and apparatus for mobile communications including sidelink transmissions is provided. A user equipment (UE) maintains a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node. The UE switches between interfaces according to a current connectivity state and based on signal attributes associated with individual interfaces.|1-39. (canceled)
| 40. A method of wireless communications, comprising:
maintaining, by a first user equipment (UE), a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node;
switching, by the first UE, between a first connectivity state and a second connectivity state based on at least one of:
a first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface;
a second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and
a third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface;
wherein:
the first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state.
| 41. The method of claim 40, wherein a signal strength attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal strength indicator (RSSI) or a reference signal received power (RSRP) of a reference signal received via a respective signal interface.
| 42. The method of claim 40, wherein a signal quality attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal received quality (RSRQ) of a reference signal received via a respective interface.
| 43. The method of claim 40 further comprising receiving configuration parameters of one or more reference signals for determining at least one of the first signal attribute, the second signal attribute or third signal attribute.
| 44. The method of claim 40, wherein the switching between the first connectivity state to the second connectivity state is based on:
comparing the first signal attribute with a first threshold;
comparing the second signal attribute with a second threshold; and
comparing the third signal attribute with a third threshold.
| 45. The method of claim 44, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.
| 46. The method of claim 44 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first threshold, the second threshold and the third threshold.
| 47. The method of claim 46, wherein the configuration parameters are received from the radio access network (RAN) infrastructure node and via the first interface.
| 48. The method of claim 47, wherein receiving the configuration parameters includes receiving one or more radio resource control (RRC) messages.
| 49. The method of claim 40, wherein the switching between the first connectivity state and the second connectivity state is further based on quality of service (QoS) requirements of data packets transmitted or received by the first UE.
| 50. The method of claim 49, wherein:
the first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state;
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and
the first UE switches from the first connectivity state to the second connectivity state based on a determination that a latency associated with communications associated with the third interface matches a latency requirement of the data packets.
| 51. The method of claim 49 further comprising, determining by the first user equipment (UE), the quality of service (QoS) requirements of the data packets based on priority tags of the data packets.
| 52. The method of claim 51, wherein the priority tags of the data packets indicate that the data packets are associated with one or more of:
safety related information;
firefighters or government agencies;
autonomous driving; and
traffic information.
| 53. The method of claim 40 further comprising synchronizing the connectivity state of the first user equipment (UE) with a Global Navigation Satellite System (GNSS).
| 54. The method of claim 40, wherein:
the maintaining the first interface, the second interface and the third interface comprises registering by the first user equipment (UE) with a Core Network node,
wherein a context of the first UE associated with the first interface, the second interface and the third interface is established in the Core Network.
| 55. The method of claim 40, wherein the switching from the first connectivity state to the second connectivity state is further based on one or more of:
a first duration for which the first signal attribute maintains at least one of a first signal strength or signal quality requirement;
a second duration for which the second signal attribute maintains at least one of a second signal strength or signal quality requirement; and
a third duration for which the third signal attribute maintains at least one of a third signal strength or signal quality requirement.
| 56. The method of claim 55, wherein
the first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state;
wherein the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and
the first UE switches from the first connectivity state to the second connectivity state when the first duration is smaller than a first duration threshold or the second duration is smaller than a second duration threshold, and, the third duration is larger than a third duration threshold.
| 57. The method of claim 56 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first duration threshold, the second duration threshold and the third duration threshold.
| 58. A first user equipment (UE), comprising a controller configured to execute processes of:
maintaining, by a first user equipment (UE), a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node;
switching, by the first UE, between a first connectivity state and a second connectivity state based on at least one of:
a first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface;
a second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and
a third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface;
wherein:
the first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state.
| 59. The first UE of claim 58, wherein the switching between the first connectivity state to the second connectivity state is based on:
comparing the first signal attribute with a first threshold;
comparing the second signal attribute with a second threshold; and
comparing the third signal attribute with a third threshold.
| 60. The first UE of claim 59, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.
| 61. The first UE of claim 59 further comprising receiving, by the first UE, configuration parameters indicating the first threshold, the second threshold and the third threshold.
| 62. The first UE of claim 61, wherein the configuration parameters are received from the radio access network (RAN) infrastructure node and via the first interface.
| 63. The first UE of claim 62, wherein receiving the configuration parameters includes receiving one or more radio resource control (RRC) messages.
| 64. The first UE of claim 58 further comprising synchronizing the connectivity state of the first user equipment (UE) with a Global Navigation Satellite System (GNSS).
| 65. A method of wireless communications, comprising:
sending parameters to a first user equipment (UE) for maintaining, by the first user equipment, a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node;
sending a message to command the first UE to switch between a first connectivity state and a second connectivity state based on at least one of:
a first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface;
a second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and
a third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface;
wherein:
the base station receives the first signal attribute, the second signal attribute and the third signal attribute from the first UE;
the first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state.
| 66. A base station comprising a controller configured to execute processes of:
sending parameters to a first user equipment (UE) for maintaining, by the first user equipment, a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node;
sending a message to command the first UE to switch between a first connectivity state and a second connectivity state based on at least one of:
a first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface;
a second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and
a third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface;
wherein:
the base station receives the first signal attribute, the second signal attribute and the third signal attribute from the first UE;
the first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and
the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state. | The method involves maintaining a set of interfaces including a first interface between a user equipment (UE) (125A) and a radio access network (RAN) infrastructure node, a second interface between the UE and another UE, and a third interface between the UE and a satellite node or airborne node. The UE is switched between a first connectivity state and a second connectivity state based on at least a first signal attribute associated with the first interface, where the UE communicates in accordance with one of the first two interfaces when the UE is determined to be in the first state, and the UE communicates with the third interface when the UE is in the second state. Method for realizing wireless communications of a mobile communication system between a user equipment and a base station (all claimed), used by a wireless communications system operator to provide services in residential, commercial or industrial settings such as IoT and industrial IoT (IIoT). Uses include but are not limited to a mobile network operator (MNO), a private network operator, a multiple system operator (MSO), an internet-of-things (IoT) network operator, voice, data, messaging, vehicular communications services such as vehicle-to-everything (V2X) communications services, safety services and mission-critical services, smartphones, tablets, laptops, computers, and wireless transmission and/or reception units in a vehicle. The method enables the computing devices to utilize the wireless communication network to facilitate interactions with other devices that can access the network, or to facilitate interaction, through the network, with devices utilizing other communication networks in an efficient manner. The drawing shows a schematic view of a mobile communication system. 100 Mobile communications system, 105 Next Generation RAN, 115 gNB, 120 ng-evolved Node B (ng-eNB), 125A User equipment
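The connectivity-state switching in claims 40, 44 and 56 of the satellite/airborne record above can be sketched as a threshold comparison over per-interface measurements. The sketch below is a simplified, conservative variant (it falls back to the satellite interface only when both terrestrial interfaces miss their strength or duration requirements, whereas claim 56 allows switching when either terrestrial duration falls short); all thresholds, durations and names are assumptions.

```python
# Illustrative threshold-based switching between connectivity states.
from dataclasses import dataclass

@dataclass
class InterfaceMeasurement:
    rsrp_dbm: float          # measured signal strength (e.g. RSRP)
    good_duration_s: float   # time the interface has met its requirement

def select_connectivity_state(uu, sidelink, satellite,
                              uu_thr_dbm=-110.0, sl_thr_dbm=-105.0, sat_thr_dbm=-118.0,
                              uu_dur_s=2.0, sl_dur_s=2.0, sat_dur_s=5.0):
    """Return 'terrestrial' (first state: Uu and/or sidelink) or
    'satellite' (second state: third interface)."""
    uu_weak = uu.rsrp_dbm < uu_thr_dbm or uu.good_duration_s < uu_dur_s
    sl_weak = sidelink.rsrp_dbm < sl_thr_dbm or sidelink.good_duration_s < sl_dur_s
    sat_ok = satellite.rsrp_dbm >= sat_thr_dbm and satellite.good_duration_s > sat_dur_s
    # Conservative fallback: leave the terrestrial state only when both the
    # RAN link and the sidelink are weak while the satellite link is stable.
    if uu_weak and sl_weak and sat_ok:
        return "satellite"
    return "terrestrial"

if __name__ == "__main__":
    state = select_connectivity_state(
        InterfaceMeasurement(-121.0, 0.5),    # degraded RAN (Uu) link
        InterfaceMeasurement(-119.0, 0.2),    # degraded sidelink
        InterfaceMeasurement(-112.0, 30.0))   # stable satellite link
    print(state)   # -> satellite
```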
Please summarize the input | AUTOMATED VALET PARKING SYSTEM AND CONTROL METHOD OF AUTOMATED VALET PARKING SYSTEM. An automatic parking system has a parking lot control server that parks a self-driving vehicle in a target parking space in a parking lot. The parking lot control server includes a vehicle information acquisition unit that acquires position information of the self-driving vehicle in the parking lot and position information of a general vehicle manually driven by a driver in the parking lot, a notification target vehicle detection unit that detects a first notification target vehicle, which is a self-driving vehicle that is a notification target for a server communication capable vehicle, and a notification unit that notifies the server communication capable vehicle of the existence of the first notification target vehicle when the first notification target vehicle is detected by the notification target vehicle detection unit.|1. An automatic parking system having a parking lot control server that parks a self-driving vehicle in a target parking space in a parking lot by instructing the self-driving vehicle in the parking lot, wherein the parking lot control server comprises: a vehicle information acquisition unit for acquiring location information of the self-driving vehicle in the parking lot and location information of a general vehicle manually driven by a driver in the parking lot; a communication availability determination unit for determining whether the general vehicle is a server communication capable vehicle capable of communicating with the parking lot control server; a notification target vehicle detection unit configured to detect a first notification target vehicle, which is the self-driving vehicle that is a notification target for the server communication capable vehicle, based on the location information of the self-driving vehicle and the location information of the server communication capable vehicle; and a notification unit that, when the first notification target vehicle is detected by the notification target vehicle detection unit, notifies the server communication capable vehicle of the existence of the first notification target vehicle.
| 2. The automatic parking system according to claim 1, wherein the notification target vehicle detection unit detects a second notification target vehicle, which is the self-driving vehicle that is a notification target for a server communication incapable vehicle, based on the location information of the self-driving vehicle and location information of the server communication incapable vehicle, the server communication incapable vehicle being the general vehicle determined by the communication availability determination unit not to be the server communication capable vehicle, and wherein, when vehicle-to-vehicle communication between the server communication incapable vehicle and the second notification target vehicle is possible, vehicle-to-vehicle communication is connected between the second notification target vehicle and the server communication incapable vehicle.
| 3. The automatic parking system according to claim 2, wherein, for the second notification target vehicle connected to the server communication incapable vehicle by vehicle-to-vehicle communication, the notification unit instructs an approach notification reservation whereby, when an inter-vehicle distance between the server communication incapable vehicle and the second notification target vehicle falls below a distance threshold value, the second notification target vehicle notifies the server communication incapable vehicle of its approach through vehicle-to-vehicle communication.
| 4. The automatic parking system according to any one of claims 1 to 3, wherein the parking lot control server further comprises a stop instructing unit configured to stop the self-driving vehicle until the general vehicle passes by, when the general vehicle is approaching from behind the self-driving vehicle or when the general vehicle being driven is crossing in front of the self-driving vehicle.
| 5. A control method of an automatic parking system having a parking lot control server for parking a self-driving vehicle in a target parking space in a parking lot by instructing the self-driving vehicle in the parking lot, the control method comprising: a vehicle information acquisition step of acquiring location information of the self-driving vehicle in the parking lot and location information of a general vehicle manually driven by a driver in the parking lot; a communication availability determination step of determining whether the general vehicle is a server communication capable vehicle capable of communicating with the parking lot control server; a first notification target vehicle detection step of detecting a first notification target vehicle, which is the self-driving vehicle that is a notification target for the server communication capable vehicle, based on the location information of the self-driving vehicle and the location information of the server communication capable vehicle; and a notification step of notifying the server communication capable vehicle of the existence of the first notification target vehicle when the first notification target vehicle is detected in the first notification target vehicle detection step. | The system (1) has a vehicle information acquisition unit that acquires positional information of the autonomous driving vehicle (2) in the parking place and positional information of a general vehicle (3) manually driven by a driver in the parking place. A communication availability determination unit determines whether or not the general vehicle is a server communicable vehicle that is able to communicate with the parking place control server. A notification target vehicle detection unit detects a first notification target vehicle that is the autonomous driving vehicle as a notification target for the server communicable vehicle based on the positional information of the autonomous driving vehicle. A notification unit is configured to notify the server communicable vehicle of the presence of the first notification target vehicle when the notification target vehicle detection unit detects the first notification target vehicle. An INDEPENDENT CLAIM is included for a method for controlling an automated valet parking system. Automated valet parking system. The system prevents the server communicable vehicle in the parking place from erroneously recognizing the autonomous driving vehicle as the general vehicle. The drawing shows the schematic diagram of an automated valet parking system. 1 Automated valet parking system, 2 Autonomous driving vehicle, 3 General vehicle, 4 Parking place sensor, 10 Communication unit
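A minimal sketch of the server-side detection in claims 1-2 of the valet parking record above: for each server communication capable general vehicle, the parking lot control server lists the nearby self-driving vehicles it should be notified about. The coordinates, the 20 m radius and the class names are assumptions for illustration only.

```python
# Hypothetical proximity-based notification-target detection on the server.
import math
from dataclasses import dataclass

@dataclass
class Vehicle:
    vid: str
    x: float               # position in the parking lot, metres
    y: float
    autonomous: bool
    server_capable: bool    # can the (manually driven) vehicle talk to the server?

def distance(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def find_notification_targets(vehicles, radius_m=20.0):
    """For every server communication capable general vehicle, list the nearby
    self-driving vehicles ('first notification target vehicles')."""
    generals = [v for v in vehicles if not v.autonomous and v.server_capable]
    autos = [v for v in vehicles if v.autonomous]
    notifications = {}
    for g in generals:
        targets = [a.vid for a in autos if distance(g, a) <= radius_m]
        if targets:
            notifications[g.vid] = targets
    return notifications

if __name__ == "__main__":
    lot = [Vehicle("AV-1", 5, 5, True, True),
           Vehicle("GEN-1", 10, 8, False, True),
           Vehicle("GEN-2", 90, 90, False, False)]
    print(find_notification_targets(lot))   # {'GEN-1': ['AV-1']}
```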
Please summarize the input | MODIFYING A VEHICULAR RADIO BASED ON A SCHEDULE OF POINT-TO-POINT VEHICULAR COMMUNICATIONS. The disclosure includes embodiments for modifying a vehicle-to-everything (V2X) radio of an ego vehicle that is a connected vehicle. In some embodiments, a method includes analyzing, by a machine learning module executed by a processor, a local dynamic map generated by the ego vehicle to determine schedule data describing a schedule for the ego vehicle to transmit a millimeter wave (mmWave) message to a remote vehicle. The method includes transmitting a V2X message including the schedule data for receipt by the remote vehicle so that the remote vehicle has access to the schedule. The method includes modifying an operation of the V2X radio of the ego vehicle based on the schedule so that the V2X radio transmits the mmWave message to the remote vehicle in compliance with the schedule. The method includes transmitting the mmWave message to the remote vehicle in compliance with the schedule. What is claimed is:
| 1. A method for an ego vehicle, comprising:
detecting an intention of a first endpoint to exchange a millimeter wave (mmWave) message with a second endpoint;
determining scenario data describing a scenario of one or more of the first endpoint and the second endpoint;
requesting a recommended beam alignment setting from a server based on the scenario data;
receiving feedback data describing the recommended beam alignment setting from the server; and
modifying an operation of a vehicle-to-everything (V2X) radio of the first endpoint based on the recommended beam alignment setting so that the V2X radio of the first endpoint exchanges the mmWave message with the second endpoint using the recommended beam alignment setting.
| 2. The method of claim 1, wherein detecting the intention of the first endpoint to exchange the mmWave message with the second endpoint includes receiving a command from an autonomous driving system of the first endpoint to transmit the mmWave message to the second endpoint.
| 3. The method of claim 1, wherein the scenario data describing the scenario is based on sensor data describing measurements of a physical environment proximate to the first endpoint.
| 4. The method of claim 1, further comprising:
generating a beam request message including the scenario data; and
transmitting the beam request message to the server via a V2X network;
wherein the feedback data is based on the beam request message.
| 5. The method of claim 4, wherein the beam request message causes the server to query a beam alignment database based on the scenario data and to generate the feedback data describing the recommended beam alignment setting as a query result.
| 6. The method of claim 1, further comprising:
modifying an operation of the V2X radio of the second endpoint based on the recommended beam alignment setting to cause a beam of the V2X radio of the first endpoint to be aligned with a beam of the V2X radio of the second endpoint so that the V2X radio of the first endpoint and the V2X radio of the second endpoint exchange the mmWave message using the recommended beam alignment setting.
| 7. The method of claim 1, further comprising:
generating mmWave performance data related to an exchange of the mmWave message using the recommended beam alignment setting; and
uploading the mmWave performance data to the server.
| 8. A system comprising:
an onboard vehicle computer system of an ego vehicle including a non-transitory memory storing computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system to:
detect an intention of a first endpoint to exchange a millimeter wave (mmWave) message with a second endpoint;
determine scenario data describing a scenario of one or more of the first endpoint and the second endpoint;
request a recommended beam alignment setting from a server based on the scenario data;
receive feedback data describing the recommended beam alignment setting from the server; and
modify an operation of a vehicle-to-everything (V2X) radio of the first endpoint based on the recommended beam alignment setting so that the V2X radio of the first endpoint exchanges the mmWave message with the second endpoint using the recommended beam alignment setting.
| 9. The system of claim 8, wherein detecting the intention of the first endpoint to exchange the mmWave message with the second endpoint includes receiving a command from an autonomous driving system of the first endpoint to transmit the mmWave message to the second endpoint.
| 10. The system of claim 8, wherein the scenario data describing the scenario is based on sensor data describing measurements of a physical environment proximate to the first endpoint.
| 11. The system of claim 8, wherein the computer code further causes the onboard vehicle computer system to:
generate a beam request message including the scenario data; and
transmit the beam request message to the server via a V2X network;
wherein the feedback data is based on the beam request message.
| 12. The system of claim 11, wherein the beam request message causes the server to query a beam alignment database based on the scenario data and to generate the feedback data describing the recommended beam alignment setting as a query result.
| 13. The system of claim 8, wherein the computer code further causes the onboard vehicle computer system to:
modify an operation of the V2X radio of the second endpoint based on the recommended beam alignment setting to cause a beam of the V2X radio of the first endpoint to be aligned with a beam of the V2X radio of the second endpoint so that the V2X radio of the first endpoint and the V2X radio of the second endpoint exchange the mmWave message using the recommended beam alignment setting.
| 14. The system of claim 8, wherein the computer code further causes the onboard vehicle computer system to:
generate mmWave performance data related to an exchange of the mmWave message using the recommended beam alignment setting; and
upload the mmWave performance data to the server.
| 15. A computer program product comprising a non-transitory memory of an onboard vehicle computer system of an ego vehicle storing computer-executable code that, when executed by a processor, causes the processor to:
detect an intention of a first endpoint to exchange a millimeter wave (mmWave) message with a second endpoint;
determine scenario data describing a scenario of one or more of the first endpoint and the second endpoint;
request a recommended beam alignment setting from a server based on the scenario data;
receive feedback data describing the recommended beam alignment setting from the server; and
modify an operation of a vehicle-to-everything (V2X) radio of the first endpoint based on the recommended beam alignment setting so that the V2X radio of the first endpoint exchanges the mmWave message with the second endpoint using the recommended beam alignment setting.
| 16. The computer program product of claim 15, wherein detecting the intention of the first endpoint to exchange the mmWave message with the second endpoint includes receiving a command from an autonomous driving system of the first endpoint to transmit the mmWave message to the second endpoint.
| 17. The computer program product of claim 15, wherein the scenario data describing the scenario is based on sensor data describing measurements of a physical environment proximate to the first endpoint.
| 18. The computer program product of claim 15, wherein the computer-executable code further causes the processor to:
generate a beam request message including the scenario data; and
transmit the beam request message to the server via a V2X network;
wherein the feedback data is based on the beam request message.
| 19. The computer program product of claim 18, wherein the beam request message causes the server to query a beam alignment database based on the scenario data and to generate the feedback data describing the recommended beam alignment setting as a query result.
| 20. The computer program product of claim 15, wherein the computer-executable code further causes the processor to:
modify an operation of the V2X radio of the second endpoint based on the recommended beam alignment setting to cause a beam of the V2X radio of the first endpoint to be aligned with a beam of the V2X radio of the second endpoint so that the V2X radio of the first endpoint and the V2X radio of the second endpoint exchange the mmWave message using the recommended beam alignment setting. | The method involves detecting (501) intention of a first endpoint to exchange a millimeter wave message with a second endpoint. Scenario data describing scenario of the endpoints is determined (503). A recommended beam alignment setting is requested from a server based on the scenario data. Feedback data describing the recommended beam alignment setting requested from the server is received. Operation of vehicle-to-everything (V2X) radio of the first endpoint is modified based on recommended beam such that the V2Xradio exchanges the millimeter wave message with the second endpoint by using the recommended beam alignment setting. INDEPENDENT CLAIMS are included for:(1) a system for modifying a V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles; and(2) a computer program product comprising a non-transitory memory for storing computer-executable code to execute a method for modifying a V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles. Method for modifying a V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles. The method enables facilitating use of high data rate millimeter wave communications for application by optimizing scheduling of millimeter wave communication in environment in which vehicles share data received from sensors. The method allows a feedback loop to determine success of the millimeter wave message and update schedule based on success rate so that likelihood of target success of target millimeter wave message is increased. The drawing shows a sequential diagram illustrating a method for modifying V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles.501Step for detecting intention of first endpoint to exchange millimeter wave message with second endpoint 503Step for determining scenario data describing scenario of endpoints 507Step for generating beam report message including scenario data and beam 508Step for transmitting beam report message to server via V2X network 509Step for receiving beam report message from ego vehicle |
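The request/feedback loop of claims 1, 4 and 5 in the mmWave record above can be illustrated roughly as follows: the vehicle describes its scenario, the server looks the scenario up in a beam alignment database, and the returned setting is applied to the V2X radio. The database entries, scenario fields and class names are invented for the sketch and are not taken from the patent.

```python
# Rough sketch of the scenario -> server lookup -> beam setting feedback loop.
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    relative_heading: str   # "same_lane", "adjacent_lane", ...
    range_bucket_m: int     # coarse distance bucket, e.g. 25, 50, 100

@dataclass
class BeamSetting:
    azimuth_deg: float
    beamwidth_deg: float

class BeamAlignmentServer:
    def __init__(self):
        # Cloud-side "beam alignment database" (illustrative entries only).
        self.db = {
            Scenario("same_lane", 25): BeamSetting(0.0, 10.0),
            Scenario("same_lane", 50): BeamSetting(0.0, 6.0),
            Scenario("adjacent_lane", 25): BeamSetting(15.0, 12.0),
        }
        self.default = BeamSetting(0.0, 20.0)   # wide-beam fallback

    def recommend(self, scenario: Scenario) -> BeamSetting:
        return self.db.get(scenario, self.default)

class V2XRadio:
    def __init__(self):
        self.setting = None

    def apply(self, setting: BeamSetting):
        # Modify the radio operation per the feedback data from the server.
        self.setting = setting

if __name__ == "__main__":
    server = BeamAlignmentServer()
    radio = V2XRadio()
    radio.apply(server.recommend(Scenario("same_lane", 50)))
    print(radio.setting)   # BeamSetting(azimuth_deg=0.0, beamwidth_deg=6.0)
```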
Please summarize the input | Ego-vehicles, systems, and methods for monitoring target objects. An ego-vehicle for displaying a behavior of a target object in a spatio-temporal manner may include one or more processors. One or more memory modules are communicatively coupled to the one or more processors. A display is communicatively coupled to the one or more processors. One or more sensors are communicatively coupled to the one or more processors. Machine readable instructions are stored in the one or more memory modules and cause the one or more processors to display on the display an object indicator associated with a position of a target object relative to the ego-vehicle, wherein the object indicator depicts a spatio-temporal patterning indicating the behavior of the target object. What is claimed is:
| 1. An ego-vehicle for displaying a behavior of a target object in a spatio-temporal manner, the ego-vehicle comprising:
one or more processors;
one or more memory modules communicatively coupled to the one or more processors;
a display communicatively coupled to the one or more processors;
one or more sensors communicatively coupled to the one or more processors; and
machine readable instructions stored in the one or more memory modules that cause the one or more processors to, based on a signal from the one or more sensors, display on the display an object indicator associated with a position of the target object relative to the ego-vehicle, wherein:
the object indicator depicts a spatio-temporal patterning indicating the behavior of the target object; and
the spatio-temporal patterning of the object indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to a speed of the target object.
| 2. The ego-vehicle of claim 1, wherein the machine readable instructions cause the one or more processors to adjust the spatio-temporal patterning of the object indicator in both time and space to indicate the behavior of the target object at any given time.
| 3. The ego-vehicle of claim 1, wherein:
the spatio-temporal patterning of the object indicator further indicates a direction the target object is traveling; and
the machine readable instructions further cause the one or more processors to move the object indicator across the display to correspond with a movement of the target object within a vicinity of the ego-vehicle.
| 4. The ego-vehicle of claim 1, wherein the spatio-temporal patterning of the object indicator indicates the speed and direction of travel of the target object relative to a speed and direction of travel of the ego-vehicle.
| 5. The ego-vehicle of claim 1, wherein the spatio-temporal patterning of the object indicator indicates an absolute speed and direction of travel of the target object.
| 6. The ego-vehicle of claim 5, wherein:
the machine readable instructions further cause the one or more processors to display an ego-vehicle indicator on the display, wherein the ego-vehicle indicator depicts a spatio-temporal patterning indicating at least the absolute speed of the ego-vehicle; and
the object indicator is displayed adjacent to the ego-vehicle indicator at a position on the display corresponding to the position of the target object within a vicinity of the ego-vehicle.
| 7. The ego-vehicle of claim 6, wherein:
the spatio-temporal patterning of the ego-vehicle indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to the absolute speed of the ego-vehicle and a direction of flow indicative of a direction of travel of the ego-vehicle; and
the at least one of the light pattern and waveform of the spatio-temporal patterning of the object indicator has a frequency proportional to the absolute speed of the target object and a direction of flow indicative of the direction of travel of the target object.
| 8. The ego-vehicle of claim 1, wherein the light pattern and waveform of the spatio-temporal patterning of the object indicator has a direction of flow indicative of a direction of travel of the target object relative to a direction of travel of the ego-vehicle.
| 9. The ego-vehicle of claim 1, wherein the frequency of the spatio-temporal patterning of the object indicator adjusts to correspond to changes in the speed of the target object.
| 10. The ego-vehicle of claim 1, wherein the machine readable instructions further cause the one or more processors to identify a target object type, wherein the object indicator indicates the target object type.
| 11. The ego-vehicle of claim 1, wherein the ego-vehicle is an autonomous vehicle.
| 12. The ego-vehicle of claim 1, wherein the machine readable instructions further cause the one or more processors to:
identify road parameters, wherein the road parameters include at least a lane of a road; and
display the road parameters on the display, wherein the object indicator is displayed in the lane corresponding to the lane the target object is in.
| 13. The ego-vehicle of claim 1, wherein the display is at least one of a heads-up display, an instrument cluster display, a navigation display, and a mobile device display.
| 14. A system for displaying a behavior of a target object in a spatio-temporal manner relative to the system, the system comprising:
one or more processors;
one or more memory modules communicatively coupled to the one or more processors;
a display communicatively coupled to the one or more processors;
one or more sensors communicatively coupled to the one or more processors; and
machine readable instructions stored in the one or more memory modules that cause the one or more processors to, based on a signal from the one or more sensors, display on the display an object indicator associated with a position of the target object relative to the system, wherein:
the object indicator depicts a spatio-temporal patterning indicating the behavior of the target object relative to the system; and
the spatio-temporal patterning of the object indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to a speed of the target object.
| 15. The system of claim 14, wherein:
the at least one of the light pattern and waveform of the spatio-temporal patterning of the object indicator has a direction of flow indicative of a direction of travel of the target object relative to a direction of travel of the system; and
the frequency of the spatio-temporal patterning adjusts to correspond to the changes in the speed and the direction of travel of the target object relative to the system.
| 16. The system of claim 14, wherein the one or more sensors include at least one of a camera, LiDAR, RADAR, and vehicle-to-vehicle communication.
| 17. The system of claim 14, wherein the machine readable instructions further cause the one or more processors to move the object indicator across the display to correspond with a movement of the target object within a vicinity of the system.
| 18. A method of displaying a behavior of a target object in a spatio-temporal manner relative to an ego-vehicle comprising:
detecting, with one or more sensors, the target object in a vicinity of the ego-vehicle;
monitoring a speed, direction of travel, and position of the target object relative to the ego-vehicle;
displaying, with one or more processors, an object indicator associated with the position of the target object relative to the ego-vehicle on a display, wherein:
the object indicator depicts a spatio-temporal patterning indicating the behavior of the target object; and
the spatio-temporal patterning of the object indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to a speed of the target object; and
adjusting the spatio-temporal patterning of the object indicator on the display to correspond with the behavior of the target object within the vicinity of the ego-vehicle.
| 19. The method of claim 18, wherein:
the at least one of the light pattern and waveform of the spatio-temporal patterning of the object indicator has a direction of flow indicative of a direction of travel of the target object relative to a direction of travel of the ego-vehicle.
| 20. The method of claim 18, further comprising:
identifying road parameters, wherein road parameters include at least a lane of a road; and
displaying virtual road parameters on the display, wherein:
the virtual road parameters include at least a virtual lane corresponding to the lane of the road; and
the object indicator of the target object is displayed in the virtual lane corresponding to the lane the target object is in. | The ego-vehicle (100) has processors, memory modules (106) communicatively coupled to processors, display (108) communicatively coupled to processors, sensors (120) communicatively coupled to processors, and machine readable instructions stored in memory modules that cause the processors to display an object indicator associated with position of target object (200) relative to ego-vehicle on the display, such that the object indicator depicts spatio-temporal patterning indicating the behavior of target object. The machine-readable instructions cause the processors to adjust spatio-temporal patterning of object indicator in both time and space to indicate behavior of the target object at any given time. The spatio-temporal patterning of object indicator indicates speed and direction of travel of target object relative to speed and direction of travel of ego-vehicle. The display is a heads-up display, an instrument cluster display, a navigation display, or a mobile device display. INDEPENDENT CLAIMS are also included for the following:a system for displaying behavior of target object in spatio-temporal manner relative to system; anda method of displaying behavior of target object in spatio-temporal manner relative to ego-vehicle. Ego-vehicle, such as autonomous vehicle, for displaying behavior of target object in spatio-temporal manner. Helps drivers of autonomous vehicles who might only be checking in periodically with road conditions to be able to quickly and efficiently understand the motions of objects outside of their vehicle. The drawing shows a schematic diagram of the ego-vehicle illustrating a communication path. 100Ego-vehicle106Memory modules108Display120Sensors200Target object |
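One way to read the spatio-temporal patterning of claims 1, 7 and 8 in the ego-vehicle record above is as a waveform whose frequency is proportional to the target's speed and whose direction of flow follows the target's travel direction relative to the ego-vehicle. The proportionality constant, the sine waveform and the function names below are illustrative assumptions.

```python
# Sketch: derive an object indicator's pattern from target speed and heading.
import math

def object_indicator_pattern(target_speed_mps, target_heading_deg,
                             ego_heading_deg, hz_per_mps=0.2):
    """Return display pattern parameters for one target object."""
    frequency_hz = target_speed_mps * hz_per_mps   # proportional to speed
    heading_diff = (target_heading_deg - ego_heading_deg + 180) % 360 - 180
    # Flow forward if the target travels roughly the same way as the ego-vehicle.
    flow = "forward" if abs(heading_diff) < 90 else "backward"
    return {"frequency_hz": frequency_hz, "flow": flow}

def waveform_sample(pattern, t_s):
    """Evaluate the indicator's waveform at time t (simple sine placeholder)."""
    sign = 1.0 if pattern["flow"] == "forward" else -1.0
    return math.sin(2 * math.pi * pattern["frequency_hz"] * sign * t_s)

if __name__ == "__main__":
    p = object_indicator_pattern(target_speed_mps=20.0, target_heading_deg=0.0,
                                 ego_heading_deg=180.0)
    print(p)                               # {'frequency_hz': 4.0, 'flow': 'backward'}
    print(round(waveform_sample(p, 0.05), 3))
```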
Please summarize the input | Vehicle systems and methods for presenting penetration metric information on a route. Systems for presenting dedicated short range communication (DSRC) penetration metric information on a route are provided. A system for presenting DSRC penetration metric information on a route includes a screen, one or more processors, one or more memory modules communicatively coupled to the one or more processors, and machine readable instructions stored in the one or more memory modules. When executed by the one or more processors, the machine readable instructions may cause the system to generate at least one route between a start location and a destination, receive, from a cloud server, DSRC penetration metric information related to the at least one route, select one route among the at least one route based on the DSRC penetration metric information, and display the selected route along with the DSRC penetration metric information for the selected route on the screen. What is claimed is:
| 1. A vehicle system for presenting dedicated short rage communication (DSRC) penetration metric information on routes, the system comprising:
a screen;
one or more processors;
one or more memory modules communicatively coupled to the one or more processors; and
machine readable instructions stored in the one or more memory modules that cause the vehicle system to perform at least the following when executed by the one or more processors:
generate two or more routes between a start location and a destination;
receive, from a cloud server, DSRC penetration metric information related to each of the two or more routes, the DSRC penetration metric information including a penetration rate for each of the two or more routes;
select one route among the two or more routes based on the DSRC penetration metric information; and
display the selected route along with the DSRC penetration metric information for the selected route on the screen.
| 2. The vehicle system of claim 1, wherein selecting one route among the two or more routes based on the DSRC penetration metric information comprises:
calculating a penetration score for each of the two or more routes based on the DSRC penetration metric information related to each of the two or more routes; and
selecting a route with a highest penetration score among the two or more routes.
| 3. The vehicle system of claim 1, wherein displaying the selected route along with the DSRC penetration metric information for the selected route on the screen comprises:
dividing the selected route into a plurality of segments based on the penetration rate for each of the plurality of segments;
displaying each of the plurality of segments in a predetermined color based on the penetration rate.
| 4. The vehicle system of claim 1, wherein displaying the selected route along with the DSRC penetration metric information for the selected route on the screen comprises:
dividing the selected route into a plurality of segments based on the penetration rate for each of the plurality of segments;
displaying each of the plurality of segments with a predetermined identifier based on the penetration rate.
| 5. The vehicle system of claim 1, wherein the DSRC penetration metric information comprises an autonomous vehicle rate.
| 6. The vehicle system of claim 1, wherein the DSRC penetration metric information comprises historical DSRC penetration metric information.
| 7. The vehicle system of claim 1, wherein the DSRC penetration metric information comprises real-time DSRC penetration metric information.
| 8. The vehicle system of claim 1, wherein displaying the selected route along with the DSRC penetration metric information for the selected route on the screen comprises
displaying the selected route along with a penetration rate and an autonomous vehicle rate for the selected route.
| 9. The vehicle system of claim 1, wherein the penetration rate comprises a percentage of vehicles equipped with DSRC.
| 10. A cloud server for providing DSRC penetration metric information on a route, the cloud server comprising:
a database storing historical DSRC penetration metric information including a historical penetration rate for the route;
a penetration rate estimator configured to estimate DSRC penetration metric information on the route at least based on the historical DSRC penetration metric information related to the route;
a network interface configured to
receive the route from a vehicle; and
provide the estimated DSRC penetration metric information on the route to the vehicle,
wherein the historical penetration rate is a historical percentage of vehicles equipped with DSRC functionality.
| 11. The cloud server of claim 10, wherein the network interface is further configured to receive information from a plurality of vehicles driving on the route in real time, and the penetration rate estimator is configured to estimate DSRC penetration metric information further based on the information from the plurality of vehicles.
| 12. The cloud server of claim 10, wherein the network interface is further configured to receive traffic information on the route in real time, and the penetration rate estimator is configured to estimate DSRC penetration metric information further based on the traffic information.
| 13. The cloud server of claim 11, wherein the information from a plurality of vehicles includes information on whether or not the vehicle has vehicle-to-vehicle communication functionality.
| 14. The cloud server of claim 13, wherein the vehicle-to-vehicle communication functionality comprises DSRC.
| 15. The cloud server of claim 11, wherein the information from a plurality of vehicles includes information on whether or not the vehicle is driving in an autonomous mode.
| 16. The cloud server of claim 11, wherein the information from a plurality of vehicles includes location information on the plurality of vehicles.
| 17. The cloud server of claim 11, wherein the network interface is further configured to receive information from a plurality of vehicles through vehicle-to-infrastructure (V2I) communication.
| 18. The cloud server of claim 11, wherein the database is updated in real time based on the information from the plurality of vehicles.
| 19. The cloud server of claim 10, wherein the penetration rate estimator is configured to divide the route into one or more segments based on the historical DSRC penetration metric information and estimate DSRC penetration metric information on the one or more segments. | The system (300) has a screen, multiple processors (302) and multiple memory modules (306) communicatively coupled to the processors. Machine readable instructions stored in the memory modules, when executed by the processors, cause the vehicle system to receive DSRC penetration metric information related to at least one route from a cloud server, select one route among the at least one route based on the DSRC penetration metric information, and display the selected route along with the DSRC penetration metric information for the selected route on the screen. Vehicle system for presenting dedicated short range communication (DSRC) penetration metric information from a cloud server (claimed), for providing visual output such as maps, navigation, entertainment and penetration metric information. The penetration rate estimator can utilize the real-time traffic information in estimating penetration metric information. The driver or an autonomous vehicle can drive with a lower probability of accidents by selecting the route with a high penetration rate, because the driver or the autonomous vehicle has an increased awareness of the environment with a higher DSRC penetration rate. The drawing shows a schematic view of a system. 300 Vehicle system, 302 Processor, 304 Communication path, 306 Memory module, 308 Display
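To make the route-selection step of claims 1-2 in the DSRC record above concrete, the sketch below scores each candidate route as a length-weighted blend of per-segment DSRC penetration rate and autonomous-vehicle rate and picks the highest score. The weighting, the example values and the names are assumptions, not values from the patent.

```python
# Illustrative penetration-score computation and route selection.
from dataclasses import dataclass

@dataclass
class Segment:
    length_km: float
    penetration_rate: float   # fraction of vehicles equipped with DSRC (0..1)
    autonomous_rate: float    # fraction of vehicles driving autonomously (0..1)

def penetration_score(route, av_weight=0.3):
    """Length-weighted blend of DSRC penetration and autonomous-vehicle rate."""
    total = sum(seg.length_km for seg in route)
    return sum(seg.length_km / total *
               ((1 - av_weight) * seg.penetration_rate +
                av_weight * seg.autonomous_rate)
               for seg in route)

def select_route(routes):
    """routes: list of (name, [Segment, ...]) pairs; return the best name."""
    return max(routes, key=lambda item: penetration_score(item[1]))[0]

if __name__ == "__main__":
    routes = [
        ("highway", [Segment(12.0, 0.65, 0.20), Segment(4.0, 0.40, 0.10)]),
        ("surface", [Segment(9.0, 0.30, 0.05), Segment(6.0, 0.35, 0.05)]),
    ]
    print(select_route(routes))   # -> highway
```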
Please summarize the input | AUTONOMOUS-MODE TRAFFIC LANE SELECTION BASED ON TRAFFIC LANE CONGESTION LEVELS. A method and device for an autonomous vehicle control unit for traffic lane selection are disclosed. In operation, a present traffic lane in relation to each of a plurality of traffic lanes for a roadway is identified. A traffic congestion level is determined for each of the plurality of traffic lanes, and the levels are compared with each other to determine a lowest-congested traffic lane of the plurality of traffic lanes. When the lowest-congested traffic lane is other than the present traffic lane, a traffic lane change command is generated that includes identifier data for an adjacent traffic lane having a lower traffic congestion level. The traffic lane change command is transmitted to effect a traffic lane change of the vehicle from the present traffic lane to an adjacent traffic lane. What is claimed is:
| 1. A method in an autonomous vehicle control unit for traffic lane selection from a roadway having a plurality of traffic lanes in a common direction of travel, the method comprising:
identifying a present traffic lane in relation to each of the plurality of traffic lanes;
determining a traffic congestion level for the each of the plurality of traffic lanes;
comparing the traffic congestion level for the each of the plurality of traffic lanes to determine a lowest-congested traffic lane of the plurality of traffic lanes;
when the lowest-congested traffic lane is other than the present traffic lane:
generating a traffic lane change command including identifier data for an adjacent traffic lane having a lower traffic congestion level; and
transmitting the traffic lane change command to effect a traffic lane change from the present traffic lane to the adjacent traffic lane.
| 2. The method of claim 1, wherein the determining the traffic congestion level for the each of the plurality of traffic lanes comprising:
sensing a vehicle positioned ahead along the common direction of travel; and
determining a distance to the vehicle to produce the traffic congestion level.
| 3. The method of claim 1, wherein the determining the traffic congestion level for the each of the plurality of traffic lanes comprising:
sensing a vehicle positioned ahead along the common direction of travel; and
detecting a closing of a longitudinal distance to the vehicle; and
determining a rate of the closing of the longitudinal distance to the vehicle to produce the traffic congestion level.
| 4. The method of claim 1, wherein the transmitting the traffic lane change command further comprising:
transmitting the traffic lane change command to a powertrain control unit; and
broadcasting the traffic lane change command.
| 5. The method of claim 4, wherein the broadcasting the traffic lane change command comprising:
a vehicle-to-vehicle communication; and
a vehicle-to-infrastructure communication.
| 6. A method in a vehicle control unit for traffic lane selection of a roadway for an autonomous vehicle operation, the method comprising:
determining a traffic congestion condition for the roadway;
when the traffic congestion condition exceeds a threshold, determining whether the roadway includes a plurality of traffic lanes for travel in a uniform travel direction; and
when the roadway includes the plurality of traffic lanes:
identifying a present traffic lane in relation to each of the plurality of traffic lanes;
determining a traffic congestion level for the each of the plurality of traffic lanes;
comparing the traffic congestion level for the each of the plurality of traffic lanes to determine whether the present traffic lane is a lowest-congested traffic lane of the plurality of traffic lanes; and
when the lowest-congested traffic lane is other than the present traffic lane, traversing the roadway to the lowest-congested traffic lane by:
generating a traffic lane change command identifying an adjacent traffic lane; and
transmitting the traffic lane change command to effect a lane change from the present traffic lane to the adjacent traffic lane.
| 7. The method of claim 6, further comprising:
when the lowest-congested traffic lane is other than the adjacent traffic lane, again traversing the roadway to the lowest-congested traffic lane by:
generating another traffic lane change command including identifier data for a next adjacent traffic lane; and
transmitting the another traffic lane change command to effect a traffic lane change from the present traffic lane to the next adjacent traffic lane.
| 8. The method of claim 6, wherein the determining the traffic congestion condition for the roadway comprising:
retrieving location data;
requesting, based on the location data, map layer data including roadway information;
receiving, in response, the map layer data indicating a present traffic speed for the roadway relative to a free-flowing traffic speed; and
processing the map layer data to produce the traffic congestion condition for the roadway.
| 9. The method of claim 6, wherein the threshold indicates less than a free-flowing traffic speed for the roadway.
| 10. The method of claim 6, wherein the traffic congestion condition for the roadway being based on sensing vehicle-to-vehicle communication levels.
| 11. The method of claim 10, wherein the sensing the vehicle-to-vehicle communication levels including sensing a volume of vehicle-to-vehicle communication collisions.
| 12. The method of claim 6, wherein the determining the traffic congestion condition for the roadway comprising:
receiving a vehicle-to-infrastructure communication message;
retrieving message data from the vehicle-to-infrastructure communication message;
determining a congestion value for the message data; and
assigning the congestion value to the traffic congestion condition.
| 13. The method of claim 6, wherein the determining whether the roadway includes the plurality of traffic lanes in the uniform travel direction comprising:
retrieving location data;
requesting, based on the location data, map layer data including roadway information data; and
receiving, in response, the map layer data.
| 14. The method of claim 13, wherein the map layer data comprises a Route Network Description File indicating an amount of the traffic lanes for the roadway.
| 15. The method of claim 6, wherein the determining whether the roadway includes the plurality of traffic lanes in the uniform travel direction comprising:
receiving vehicle sensor data;
determining roadway features based on the vehicle sensor data;
inferring more than one traffic lane from the roadway features; and
generating an initial estimate of traffic lane geometry.
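One way the claim 15 inference could be approximated in code: detected lane-marking offsets stand in for the roadway features, and gaps of roughly one lane width are counted as traffic lanes. The feature format and the 3.7 m nominal lane width are assumptions for the sketch.

NOMINAL_LANE_WIDTH_M = 3.7  # assumed typical lane width

def estimate_lane_geometry(marking_offsets_m):
    """Infer lane count and rough geometry from lateral marking offsets.

    marking_offsets_m: lateral positions (meters) of detected lane markings
    relative to the vehicle sensor. Returns (lane_count, lanes), where lanes
    is a list of (left_edge, right_edge) tuples.
    """
    offsets = sorted(marking_offsets_m)
    gaps = list(zip(offsets[:-1], offsets[1:]))
    # Keep only gaps that plausibly correspond to a travel lane.
    lanes = [(l, r) for l, r in gaps
             if 0.5 * NOMINAL_LANE_WIDTH_M < (r - l) < 1.5 * NOMINAL_LANE_WIDTH_M]
    return len(lanes), lanes

# Example: three detected markings -> two inferred lanes.
count, geometry = estimate_lane_geometry([-1.8, 1.9, 5.6])
print(count, geometry)  # 2 [(-1.8, 1.9), (1.9, 5.6)]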
| 16. The method of claim 6, wherein the transmitting the traffic lane change command to effect the lane change from the present traffic lane to the adjacent traffic lane further comprising:
transmitting the traffic lane command to a powertrain control unit; and
broadcasting the traffic lane command.
| 17. A vehicle control unit for traffic lane selection comprising:
a wireless communication interface to service communication with a vehicle network and user equipment of a vehicle user;
a processor coupled to the wireless communication interface, the processor for controlling operations of the vehicle control unit; and
a memory coupled to the processor, the memory for storing data and program instructions used by the processor, the processor configured to execute instructions stored in the memory to:
identify a present traffic lane in relation to each of the plurality of traffic lanes;
determine a traffic congestion level for the each of the plurality of traffic lanes;
compare the traffic congestion level for the each of the plurality of traffic lanes to determine a lowest-congested traffic lane of the plurality of traffic lanes; and
when the lowest-congested traffic lane is other than the present traffic lane:
generate a traffic lane change command including identifier data for an adjacent traffic lane having a lower traffic congestion level; and
transmit the traffic lane change command to effect a traffic lane change from the present traffic lane to the adjacent traffic lane.
| 18. The vehicle control unit of claim 17, wherein the processor being further configured to execute further instructions stored in the memory to determine the traffic congestion level for the each of the plurality of traffic lanes by:
sensing a vehicle positioned ahead along the common direction of travel; and
determining a distance to the vehicle to produce the traffic congestion level.
| 19. The vehicle control unit of claim 17, wherein the processor being further configured to execute further instructions stored in the memory to determine the traffic congestion level for the each of the plurality of traffic lanes by:
sensing a vehicle positioned ahead along the common direction of travel; and
detecting a closing of a longitudinal distance to the vehicle; and
determining a rate of the closing of the longitudinal distance to the vehicle to produce the traffic congestion level.
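A sketch of the claim 19 computation: consecutive longitudinal-distance samples to the vehicle ahead give a closing rate, which is normalized into a congestion level. The normalization constant is an assumed value, not part of the claim.

def congestion_from_closing_rate(distance_samples_m, sample_period_s, norm_rate_mps=5.0):
    """Produce a congestion level in [0, 1] from longitudinal distance samples.

    distance_samples_m: consecutive distances (meters) to the vehicle ahead.
    sample_period_s: time between samples. A faster closing rate (distance
    shrinking quickly) maps to a higher congestion level.
    """
    if len(distance_samples_m) < 2:
        return 0.0
    closing_rate = (distance_samples_m[0] - distance_samples_m[-1]) / (
        sample_period_s * (len(distance_samples_m) - 1)
    )
    closing_rate = max(closing_rate, 0.0)  # ignore an opening gap
    return min(closing_rate / norm_rate_mps, 1.0)

# Example: gap shrinks from 40 m to 28 m over 4 samples at 1 s spacing.
print(congestion_from_closing_rate([40, 36, 32, 28], 1.0))  # 0.8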
| 20. The vehicle control unit of claim 17, wherein the processor being further configured to execute further instructions stored in the memory to transmit the traffic lane change command by:
transmitting the traffic lane command to a powertrain control unit; and
broadcasting the traffic lane command. | The method involves identifying a present traffic lane in relation to a set of traffic lanes. Traffic congestion level is determined for the traffic lanes. The traffic congestion level for the traffic lanes is compared to determine a lowest-congested traffic lane of the traffic lanes. A traffic lane change command (240) including identifier data is generated for an adjacent traffic lane having a lower traffic congestion level. The traffic lane change command is transmitted to effect a traffic lane change from the present traffic lane to the adjacent traffic lane. An INDEPENDENT CLAIM is also included for a vehicle control unit. Method for facilitating traffic lane selection from a roadway in a control unit (claimed) i.e. audio/visual control unit of an autonomous vehicle using a handheld mobile device. Uses include but are not limited to a passenger car, passenger lorry, semi-lorry, cargo van, emergency or first response vehicle and a transport vehicle, using a smart phone, personal digital assistant (PDA) device, tablet computer and e-reader. The method enables providing a vehicular interface for a driver to interact with vehicle systems, interactive displays, audio systems, voice recognition systems, buttons and dials and haptic feedback systems for inputting or outputting information. The method enables allowing vehicle-to-infrastructure communications to broadcast traffic stoppage points and provide advance indication to the autonomous vehicle control unit about oncoming traffic congestion, beacons and vehicle-to-infrastructure devices to gather local traffic information and local traffic congestion, and broadcast the gathered data. The method enables using a light detection and ranging (LIDAR) to determine distance between a sensor input device and an object with a high degree of accuracy due to moving of light at a constant speed. The method enables using a touch screen for providing visual information and detecting presence and location of a tactile input upon a surface of or adjacent to the display. The drawing shows a schematic block diagram of a vehicle control unit in a context of a vehicle network environment. 200 Autonomous vehicle control unit, 201 Vehicle network environment, 202 Head unit device, 240 Traffic lane change command, 248 Powertrain Control Unit
Please summarize the input | METHOD AND APPARATUS FOR WEATHER SUPPORT FOR AN AUTONOMOUS VEHICLE
A method and an apparatus for weather support are provided. In an embodiment, a request from a user to check weather information is received, and trip information provided by the request is also identified. The weather information in response to the request is retrieved and output to the user. In another embodiment, user information of the user and the trip information of the trip are captured. The weather information corresponding to the trip is retrieved and output to the user. In addition, trip suggestions are provided to the user. In yet another embodiment, the trip information is retrieved and available routes are identified corresponding to the trip. Further, respective weather information, respective traffic information, and respective physical information of each of the available routes are retrieved. Route for the trip is determined based on the respective weather information, the respective physical information and the respective traffic information of each of the available routes.
What is claimed is:
| 1. A method for weather support, comprising:
receiving, via interface circuitry of an apparatus installed in a vehicle, a request to check weather information;
identifying, by processing circuitry of the apparatus, an expected location and expected time of a trip provided by the request, retrieving the weather information corresponding to the trip, and outputting the weather information and trip suggestions in response to the request;
capturing, via the interface circuitry, user information and trip information of the trip, retrieving the weather information corresponding to the trip, outputting the weather information corresponding to the trip, and providing trip suggestions to the user according to the user information, the trip information and the weather information; and
retrieving, via the interface circuitry, the trip information of the trip, identifying a plurality of available routes associated with the trip, retrieving respective weather information of each of the plurality of available routes, retrieving respective physical information of each of the plurality of available routes, retrieving respective traffic information of each of the plurality of available routes, determining a route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes, and outputting the determined route for the trip.
| 2. The method of claim 1, further comprising:
receiving the request from the user through a microphone;
identifying the expected location and the expected time of the trip provided by the request through a voice recognition technique;
retrieving the weather information corresponding to the trip through at least one of a weather website, an application, and a data store;
outputting the weather information in response to the request through a speaker or a display device;
providing the trip suggestions according to the weather information through the speaker or the display device.
| 3. The method of claim 2, wherein the retrieving the weather information corresponding to the trip comprises:
retrieving weather information on a specific day or at specific time;
retrieving weather information for a duration of days or a period of time;
retrieving weather information for a specific location; and
retrieving current weather information at current location.
| 4. The method of claim 2, wherein the providing the trip suggestions comprises:
providing suggestions on trip supplies; and
providing suggestions on trip safety.
| 5. The method of claim 1, further comprising:
capturing the user information of the user through a camera, the user information including trip supplies that the user prepares for the trip and dress that the user wears;
capturing the trip information through at least one of a navigation system installed in the vehicle, a portable communication device of the user, and the request input via a microphone by the user, the trip information including the expected location and the expected time of the trip;
retrieving the weather information corresponding to the trip through at least one of a weather web site, an application, and a data store;
outputting the weather information corresponding to the trip through a speaker or a display device; and
providing the trip suggestions to the user on the trip supplies and trip safety through the speaker or the display device according to the captured user information, the captured trip information and the retrieved weather information.
| 6. The method of claim 5, wherein the capturing the user information of the user is operated through at least one of image recognition, pattern recognition, feature recognition, and signal recognition.
| 7. The method of claim 5, further comprising:
training a machine learning algorithm based on the captured user information, and deploying the trained machine learning algorithm to identify similar user information in a future event.
| 8. The method of claim 1, further comprising:
retrieving the trip information of the trip through at least one of a navigation system installed in the vehicle, a portable communication device of the user, and the request input via a microphone by the user;
identifying the plurality of available routes associated with the trip through a map database;
retrieving the respective weather information of each of the plurality of available routes through at least one of a weather website, an application, and/or a data store;
retrieving the respective physical information of each of the plurality of available routes through at least one of a vehicle to infrastructure system, a cloud-based system, an application, and/or a data store;
retrieving the respective traffic information of each of the plurality of available routes through at least one of a traffic system, a cloud-based system, an application, and a data store;
determining the route for the trip from the available routes based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes; and
outputting the route for the trip to at least one of a speaker, a display device, and a navigation system.
| 9. The method of claim 8, wherein the determining the route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the available routes comprising:
selecting routes having no weather hazard or least weather hazard from the plurality of available routes based on the respective weather information of each of the plurality of available routes;
selecting routes having no traffic issues or fewest traffic issues from the routes that have no weather hazard or least weather hazards based on the respective traffic information of each of the plurality of available routes;
selecting routes having no physical issues or fewest physical issues from the routes that have no traffic issues or fewest traffic issues based on the respective physical information of each of the plurality of available routes; and
determining the route for the trip from the routes that have no physical issues or the fewest physical issues based on at least one of total driving time, driving costs, or driving distance.
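A minimal sketch of the successive filtering recited in claim 9; the route record fields and the choice of total driving time as the final tie-break are illustrative assumptions.

def determine_route(routes):
    """Filter candidate routes by weather, then traffic, then physical issues.

    routes: list of dicts with assumed keys 'weather_hazards',
    'traffic_issues', 'physical_issues', and 'drive_time_min'.
    Returns the selected route, or None if no candidates were given.
    """
    if not routes:
        return None

    def keep_minimum(candidates, key):
        best = min(candidates, key=lambda r: r[key])[key]
        return [r for r in candidates if r[key] == best]

    candidates = keep_minimum(routes, "weather_hazards")
    candidates = keep_minimum(candidates, "traffic_issues")
    candidates = keep_minimum(candidates, "physical_issues")
    # Tie-break on total driving time (could equally be cost or distance).
    return min(candidates, key=lambda r: r["drive_time_min"])

routes = [
    {"name": "A", "weather_hazards": 0, "traffic_issues": 2, "physical_issues": 0, "drive_time_min": 55},
    {"name": "B", "weather_hazards": 0, "traffic_issues": 1, "physical_issues": 1, "drive_time_min": 48},
    {"name": "C", "weather_hazards": 1, "traffic_issues": 0, "physical_issues": 0, "drive_time_min": 42},
]
print(determine_route(routes)["name"])  # "B"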
| 10. The method of claim 8, further comprising:
outputting the determined route for the trip to the navigation system and automatically controlling driving of the vehicle through a control unit according to the determined route that is output to the navigation system.
| 11. An apparatus for weather support, comprising:
interface circuitry configured to transmit messages within the apparatus, and between the apparatus and external devices; and
processing circuitry configured to
receive, via the interface circuitry, a request from a user to check weather information, identify an expected location and expected time of a trip provided by the request, retrieve the weather information corresponding to the trip, and output the weather information and trip suggestions in response to the request;
capture, via the interface circuitry, user information and trip information of the trip, retrieve the weather information corresponding to the trip, output the weather information corresponding to the trip, and provide trip suggestions to the user according to the user information, the trip information and the weather information; and
retrieve, via interface circuitry, the trip information of the trip, identify a plurality of available routes associated with the trip, retrieve respective weather information of each of the plurality of available routes, retrieve respective physical information of each of the plurality of available routes, retrieve respective traffic information of each of the plurality of available routes, determine a route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes, and output the determined route for the trip.
| 12. The apparatus of claim 11, wherein the processing circuitry is further configured to:
receive the request from the user through a microphone;
identify the expected location and the expected time of the trip provided by the request through a voice recognition technique;
retrieve the weather information corresponding to the trip through at least one of a weather web site, an application, and/or a data store;
output the weather information in response to the request through a speaker or a display device;
provide the trip suggestions according to the weather information through the speaker or the display device.
| 13. The apparatus of claim 12, wherein the processing circuitry is further configured to:
retrieve the weather information on a specific day or at specific time;
retrieve the weather information for a duration of days or a period of time;
retrieve the weather information for a specific location; and
retrieve current weather information at a current location.
| 14. The apparatus of claim 12, wherein the processing circuitry is further configured to:
provide suggestions on trip supplies; and
provide suggestions on trip safety.
| 15. The apparatus of claim 11, wherein the processing circuitry is further configured to:
capture the user information of the user through a camera, the user information including trip supplies that the user prepares for the trip and dress that the user wears;
capture the trip information through at least one of a navigation system installed in the vehicle, a portable communication device of the user, or the request input via a microphone by the user, the trip information including the expected location and the expected time of the trip;
retrieve the weather information corresponding to the trip through at least one of a weather web site, an application, and a data store;
output the weather information corresponding to the trip through a speaker or a display device; and
provide the trip suggestions to the user on the trip supplies and trip safety through the speaker or the display device according to the captured user information, the captured trip information and the retrieved weather information.
| 16. The apparatus of claim 15, wherein the processing circuitry is further configured to
train a machine learning algorithm based on the captured user information, and deploy the trained machine learning algorithm to identify similar user information in a future event.
| 17. The apparatus of claim 11, wherein the processing circuitry is further configured to
retrieve the trip information of the trip through at least one of a navigation system installed in the vehicle, a portable communication device of the user, or the request input via a microphone by the user;
identify the plurality of available routes associated with the trip through a map database;
retrieve the respective weather information of each of the plurality of available routes through at least one of a weather web site, an application, and a data store;
retrieve the respective physical information of each of the plurality of available routes through at least one of a vehicle to infrastructure system, a cloud-based system, an application, and a data store;
retrieve the respective traffic information of each of the plurality of available routes through at least one of a traffic system, a cloud-based system, an application, and a data store;
determine the route for the trip from the plurality of available routes based on the respective weather information, the respective physical information and the respective traffic information of each of the available routes; and
output the route for the trip to at least one of a speaker, a display device, and a navigation system.
| 18. The apparatus of claim 17, wherein the processing circuitry is further configured to:
select routes having no weather hazard or least weather hazards from the plurality of available routes based on the respective weather information of each of the plurality of available routes;
select routes having no traffic issues or fewest traffic issues from the routes that have no weather hazard or least weather hazards based on the respective traffic information of each of the plurality of available routes;
select routes having no physical issues or fewest physical issues from the routes that have no traffic issues or fewest traffic issues based on the respective physical information of each of the plurality of available routes; and
determine the route for the trip from the routes that have no physical issues or fewest physical issues based on at least one of total driving time, driving costs, or driving distance.
| 19. The apparatus of claim 17, wherein the processing circuitry is further configured to:
output the determined route for the trip to the navigation system and automatically control driving of the vehicle through a control unit according to the determined route that is output to the navigation system.
| 20. A non-transitory computer readable storage medium having instructions stored thereon that when executed by processing circuitry causes the processing circuitry to perform operations, the operations comprising:
receiving a request from a user to check weather information, identifying an expected location and expected time of a trip provided by the request, retrieving the weather information corresponding to the trip, and outputting the weather information and trip suggestions in response to the request;
capturing user information of the user and trip information of the trip, retrieving the weather information corresponding to the trip, outputting the weather information corresponding to the trip, and providing trip suggestions to the user according to the user information, the trip information and the weather information; and
retrieving the trip information of the trip, identifying a plurality of available routes associated with the trip, retrieving respective weather information of each of the plurality of available routes, retrieving respective physical information of each of the plurality of available routes, retrieving respective traffic information of each of the plurality of available routes, determining a route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes, and outputting the determined route for the trip. | The method involves receiving a request to check the weather information by an interface circuitry of an apparatus, which is installed in a vehicle. The expected location and the expected time of a trip provided by the request are identified by a processing circuitry of the apparatus. The weather information corresponding to the trip is retrieved, where the weather information and the trip suggestions in response to the request are outputted. The user information and the trip information of the trip are captured by the interface circuitry, where the weather information corresponding to the trip is retrieved. The weather information corresponding to the trip is outputted, where the trip suggestions are provided to a user according to the user information. INDEPENDENT CLAIMS are included for the following: an apparatus with a processing circuitry; and a non-transitory computer readable storage medium for storing the instructions executed by a processing circuitry to perform the operations. Method for providing the weather support. The request to check the weather information is received by an interface circuitry of an apparatus, which is installed in a vehicle, and hence ensures providing the weather information to a user of a vehicle in an effective manner and provides the route selections based on the weather information, and also avoids the weather hazard, thus identifies the similar user information in a future event accurately and promptly. The drawing shows a schematic view of an apparatus for the weather support. 100 Weather support apparatus, 100A Interface group, 100B Processing group, 102 Camera, 104 Audio input device, 106 Audio output device, 108 Communication device
Please summarize the input | Route modification to continue fully-autonomous driving
A system and method for route modification to continue fully-autonomous driving is provided. The method includes operating a vehicle in a Level 3 autonomous driving mode according to a determined route; collecting data in real time concerning the route ahead of the vehicle; based on the collected data, identifying areas of the route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode; modifying the route based on the identified areas to continue operating in the Level 3 autonomous driving mode; and operating the vehicle in a Level 3 autonomous driving mode according to the modified route.
What is claimed is:
| 1. A vehicle comprising:
a processor; and
a non-transitory machine-readable storage medium encoded with instructions executable by the processor, the machine-readable storage medium comprising instructions to cause the processor to perform a method comprising:
operating the vehicle in a Level 3 autonomous driving mode according to a first route;
collecting data in real time concerning the first route ahead of the vehicle;
based on the collected data, identifying areas of the first route ahead of the vehicle that would cause the vehicle to leave the Level 3 autonomous driving mode, the identified areas comprising a portion of a first lane of the first route from which a safe automatic stop of the vehicle cannot be performed;
modifying the first route to a second route by a system of the vehicle, wherein the second route is based on selecting a second lane to avoid the first lane comprising the portion from which the safe automatic stop of the vehicle cannot be performed, and wherein the second route allows the vehicle to continue operating in the Level 3 autonomous driving mode along the second route; and
operating the vehicle in a Level 3 autonomous driving mode according to the second route.
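A simplified, self-contained sketch of the claimed flow: segments of the first route from which a safe automatic stop cannot be performed are identified from collected data, and a second route is formed by selecting another lane for those segments. The segment fields and the alternate-lane mapping are hypothetical placeholders, not the actual vehicle systems.

def needs_replan(route_segments):
    """Return lane segments ahead from which a safe automatic stop cannot be made.

    route_segments: list of dicts with assumed keys 'lane', 'has_shoulder',
    and 'construction_zone', built from real-time sensor/V2V/map/traffic data.
    """
    return [s for s in route_segments if not s["has_shoulder"] or s["construction_zone"]]

def modify_route(route_segments, alternate_lanes):
    """Swap blocked segments to an alternate lane so Level 3 driving can continue.

    alternate_lanes: mapping segment index -> lane known to allow a safe stop.
    Returns a new segment list (the 'second route' of the claims).
    """
    modified = [dict(s) for s in route_segments]
    for i, seg in enumerate(modified):
        if (not seg["has_shoulder"] or seg["construction_zone"]) and i in alternate_lanes:
            modified[i] = {"lane": alternate_lanes[i], "has_shoulder": True, "construction_zone": False}
    return modified

# Example: segment 1 has no shoulder in lane 0; lane 1 allows a safe stop.
first_route = [
    {"lane": 0, "has_shoulder": True, "construction_zone": False},
    {"lane": 0, "has_shoulder": False, "construction_zone": False},
]
second_route = modify_route(first_route, alternate_lanes={1: 1})
print(needs_replan(second_route))  # [] -> Level 3 operation can continue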
| 2. The vehicle of claim 1, wherein identifying areas of the first route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode comprises:
identifying areas of the first route ahead of the vehicle where a safe automatic stop of the vehicle cannot be performed.
| 3. The vehicle of claim 1, wherein the method further comprises:
modifying the first route based on the identified areas to continue operating in the Level 3 autonomous driving mode according to a constraint, wherein the constraint comprises at least one of a maximum trip time, a maximum trip mileage, a maximum increase in trip drive time, a maximum increase in trip mileage, a maximum percentage increase in trip drive time, a maximum percentage increase in trip mileage, and a desired time of arrival at a destination.
| 4. The vehicle of claim 1, wherein the method further comprises:
presenting the second route to an occupant of the vehicle; and
modifying the first route to the second route only after receiving a confirmation of the second route from the occupant.
| 5. The vehicle of claim 1, wherein the method further comprises:
collecting the data in real time using at least one of sensors on the vehicle, communication with other vehicles, a map database, and a position of the vehicle.
| 6. The vehicle of claim 1, wherein the portion of the first lane of the first route from which the safe automatic stop of the vehicle cannot be performed comprises a travel lane with no shoulder.
| 7. The vehicle of claim 1, wherein the portion of the first lane of the first route from which the safe automatic stop of the vehicle cannot be performed comprises a construction zone.
| 8. The vehicle of claim 1, wherein the data collected in real time is received from a pre-mapped database.
| 9. The vehicle of claim 1, wherein the data collected in real time is received from vehicle-to-vehicle communication.
| 10. The vehicle of claim 1, wherein the data collected in real time is real-time traffic information.
| 11. A non-transitory machine-readable storage medium encoded with instructions executable by a hardware processor of a computing component of a vehicle, the machine-readable storage medium comprising instructions to cause the hardware processor to perform a method comprising:
operating the vehicle in a Level 3 autonomous driving mode according to a first route;
collecting data in real time concerning the first route ahead of the vehicle;
based on the collected data, identifying areas of the route ahead of the vehicle that would cause the vehicle to leave the Level 3 autonomous driving mode, the identified areas comprising a portion of a first lane of the first route from which a safe automatic stop of the vehicle cannot be performed;
modifying the first route to a second route by a system of the vehicle, wherein the second route is based on selecting a second lane to avoid the first lane comprising the portion from which the safe automatic stop of the vehicle cannot be performed, and wherein the second route allows the vehicle to continue operating in the Level 3 autonomous driving mode along the second route; and
operating the vehicle in a Level 3 autonomous driving mode according to the second route.
| 12. The medium of claim 11, wherein identifying areas of the first route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode comprises:
identifying areas of the first route ahead of the vehicle where a safe automatic stop of the vehicle cannot be performed.
| 13. The medium of claim 11, wherein the method further comprises:
modifying the first route based on the identified areas to continue operating in the Level 3 autonomous driving mode according to a constraint, wherein the constraint comprises at least one of a maximum trip time, a maximum trip mileage, a maximum increase in trip drive time, a maximum increase in trip mileage, a maximum percentage increase in trip drive time, a maximum percentage increase in trip mileage, and a desired time of arrival at a destination.
| 14. The medium of claim 11, wherein the method further comprises:
presenting the second route to an occupant of the vehicle; and
modifying the first route to the second route only after receiving a confirmation of the second route from the occupant.
| 15. The medium of claim 11, wherein the method further comprises:
collecting the data in real time using at least one of sensors on the vehicle, communication with other vehicles, a map database, and a position of the vehicle.
| 16. A method for operating a vehicle, the method comprising:
operating the vehicle in a Level 3 autonomous driving mode according to a first route;
collecting data in real time concerning the first route ahead of the vehicle;
based on the collected data, identifying areas of the first route ahead of the vehicle that would cause the vehicle to leave the Level 3 autonomous driving mode, the identified areas comprising a portion of a first lane of the first route from which a safe automatic stop of the vehicle cannot be performed;
modifying the first route to a second route by a system of the vehicle, wherein the second route is based on selecting a second lane to avoid the first lane comprising the portion from which the safe automatic stop of the vehicle cannot be performed, and wherein the second route allows the vehicle to continue operating in the Level 3 autonomous driving mode along the second route; and
operating the vehicle in a Level 3 autonomous driving mode according to the second route.
| 17. The method of claim 16, wherein identifying areas of the first route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode comprises:
identifying areas of the first route ahead of the vehicle where a safe automatic stop of the vehicle cannot be performed.
| 18. The method of claim 16, further comprising:
modifying the first route based on the identified areas to continue operating in the Level 3 autonomous driving mode according to a constraint, wherein the constraint comprises at least one of a maximum trip time, a maximum trip mileage, a maximum increase in trip drive time, a maximum increase in trip mileage, a maximum percentage increase in trip drive time, a maximum percentage increase in trip mileage, and a desired time of arrival at a destination.
| 19. The method of claim 16, further comprising:
presenting the second route to an occupant of the vehicle; and
modifying the first route to the second route only after receiving a confirmation of the second route from the occupant. | The vehicle (102) comprises a processor, and a non-transitory machine-readable storage medium is encoded with instructions executable by the processor, and the vehicle is operated in a level-3 autonomous driving mode according to a determined route. The data is collected in real time concerning the route ahead of the vehicle. Areas of the route ahead of the vehicle that would require the vehicle to leave the level-3 autonomous driving mode are identified based on the collected data. The route is modified based on the identified areas to continue operating in the level-3 autonomous driving mode. The vehicle is operated in a level-3 autonomous driving mode according to the modified route. INDEPENDENT CLAIMS are included for the following: a non-transitory machine-readable storage medium having stored instructions for implementing the method for operating a vehicle; and a method for operating a vehicle. Vehicle, such as autonomous vehicle. Driver can be ready to take full control of the vehicle as in manual mode. Route control circuit can receive information from multiple vehicle sensors to determine whether the route control mode should be activated. The drawing shows a block diagram of the vehicle. 28 Differential gear device, 30 Axles, 32 Crankshaft, 34 Wheels, 102 Vehicle
Please summarize the input | Parking assistance control for vehicle with autonomous operation capability
Provided is a method and device for parking assistance for a vehicle capable of autonomous operation. The embodiment herein operates to receive a parking zone, which is based on a destination location and a user-defined parking parameter, and includes a plurality of parking locations. When on approach to the parking zone under an autonomous vehicle operation, the parking assistance determines whether the parking zone includes at least one parking location that is physically available for parking the vehicle. When the parking zone does not, the parking assistance prompts the vehicle to engage in a holding pattern for a predetermined period of time. While in the holding pattern under the autonomous operation, the parking assistance periodically determines whether the at least one parking location becomes available. When the predetermined period of time lapses, a parking status of the vehicle is transmitted to a vehicle user.
What is claimed is:
| 1. A method in a parking assistance control unit for a vehicle capable of autonomous operation, the method comprising:
receiving a parking zone based on a destination location and a user-defined parking parameter, the parking zone including a plurality of parking locations;
when on approach to the parking zone under the autonomous operation, determining whether the parking zone includes at least one parking location of the plurality of parking locations that is physically available for parking the vehicle;
when the parking zone does not include the at least one parking location that is physically available for parking the vehicle:
prompting the vehicle to engage in a holding pattern for a predetermined period of time, wherein the holding pattern includes at least one of a dynamic holding pattern indicative of placing the vehicle in motion and a stationary holding pattern indicative of placing the vehicle in a stopped state;
while in the holding pattern under the autonomous operation, periodically determining whether the at least one parking location becomes available;
when the predetermined period of time lapses, transmitting a parking status of the vehicle based on a result of the periodically determining whether the at least one parking location becomes available.
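A compact sketch of the claim 1 control flow above, with a polling loop standing in for the periodic determination; the callables and the timing values are placeholders rather than the claimed vehicle systems.

import time

def park_or_hold(is_spot_available, try_park, hold_seconds=300, poll_seconds=15):
    """Try to park in the zone; otherwise hold and poll until the period lapses.

    is_spot_available: callable returning True when at least one parking
    location in the zone is physically available (per sensors, V2V, or V2I).
    try_park: callable that performs the autonomous parking maneuver.
    Returns a parking status string to transmit to the vehicle user.
    """
    if is_spot_available():
        try_park()
        return "parked"

    # Engage a holding pattern (stationary or dynamic) and re-check periodically.
    deadline = time.monotonic() + hold_seconds
    while time.monotonic() < deadline:
        time.sleep(poll_seconds)
        if is_spot_available():
            try_park()
            return "parked"
    return "holding period lapsed; no parking location available"

# Example wiring with stub callables (short times so the example finishes quickly):
# status = park_or_hold(lambda: False, lambda: None, hold_seconds=10, poll_seconds=2)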
| 2. The method of claim 1, wherein the determining whether the parking zone includes at least one parking location that is physically available for parking the vehicle further comprises at least one of:
receiving a vehicle-to-vehicle communications relating to the at least one parking location;
receiving a vehicle-to-infrastructure communications relating to the at least one parking location; and
sensing through a plurality of sensor devices at a periphery of the vehicle relating to the at least one parking location.
| 3. The method of claim 1, wherein the holding pattern comprises:
a combinational stationary and dynamic holding pattern.
| 4. The method of claim 1, wherein the holding pattern is based on traffic trends related to the parking zone.
| 5. The method of claim 4, wherein the traffic trends include at least one of:
a traffic density assessment; and
a local venue activity assessment.
| 6. The method of claim 1, wherein the holding pattern is based on a vehicle restriction, the vehicle restriction including at least one of:
a vehicle range limitation; and
a maneuverability limitation of the vehicle.
| 7. The method of claim 1, further comprising:
determining whether the predetermined period of time has elapsed; and
when the period of time has elapsed:
receiving a secondary parking zone based on the destination location and the user-defined parking parameter;
when on approach to the secondary parking zone under the autonomous operation, determining whether the secondary parking zone includes at least one parking location that is physically available for parking the vehicle; and
when the secondary parking zone does not include the at least one parking location:
prompting the vehicle to engage in another holding pattern for another period of time;
while in the holding pattern under the autonomous operation, periodically determining whether the secondary parking zone includes the at least one parking location or another at least one parking location; and
transmitting another parking status of the vehicle.
| 8. The method of claim 1, wherein the parking zone is defined by a graphic user interface of the vehicle.
| 9. A parking assistance control unit for a vehicle comprising:
a wireless communication interface to service communication with user equipment of a vehicle user;
a plurality of sensor devices disposable about the vehicle;
one or more processors coupled to the wireless communication interface and in communication with the plurality of sensor devices, the one or more processors for controlling operations of the parking assistance control unit;
a memory coupled to the one or more processors, the memory for storing data and program instructions used by the one or more processors, wherein the one or more processors are configured to execute instructions stored in the memory to:
receive a parking zone based on a destination location and a user-defined parking parameter, the parking zone including a plurality of parking locations;
when on approach to the parking zone under the autonomous operation, determine whether the parking zone includes at least one parking location being physically available for parking the vehicle via sensor data of the plurality of sensor devices; and
when the parking zone does not include the at least one parking location that is physically available for parking the vehicle:
prompt the vehicle to engage in a holding pattern for a predetermined period of time, wherein the holding pattern includes at least one of a dynamic holding pattern indicative of placing the vehicle in motion and a stationary holding pattern indicative of placing the vehicle in a stopped state;
while in the holding pattern under the autonomous operation, periodically determining whether the at least one parking location becomes available; and
when the predetermined period of time lapses, transmit via the wireless communication interface to the user equipment of the vehicle user a parking status of the vehicle based on a result of the periodically determining whether the at least one parking location becomes available.
| 10. The parking assistance control unit of claim 9, wherein the one or more processors are further configured to execute further instructions stored in the memory to determine whether the parking zone includes at least one parking location that is physically available for parking the vehicle further comprising:
vehicle-to-vehicle communication relating to the at least one parking location; and
vehicle-to-infrastructure communication relating to the at least one parking location.
| 11. The parking assistance control unit of claim 10, wherein the holding pattern comprises:
a combinational stationary and dynamic holding pattern.
| 12. The parking assistance control unit of claim 10, wherein the holding pattern is based on traffic trends related to the parking zone.
| 13. The parking assistance control unit of claim 12, wherein the traffic trends include at least one of:
a traffic density assessment; and
a local venue activity assessment.
| 14. The parking assistance control unit of claim 10, wherein the holding pattern is based on a vehicle restriction, the vehicle restriction including at least one of:
a vehicle range limitation; and
a maneuverability limitation of the vehicle.
| 15. The parking assistance control unit of claim 10, wherein the one or more processors are further configured to execute further instructions stored in the memory to:
determine whether the predetermined period of time has elapsed; and
when the predetermined period of time has elapsed:
receive a secondary parking zone based on the destination location and the user-defined parking parameter;
when on approach to the secondary parking zone under the autonomous operation, determine whether the secondary parking zone includes at least one parking location that is physically available for parking the vehicle; and
when the secondary parking zone does not include the at least one parking location:
prompt the vehicle to engage in another holding pattern for another period of time;
while in the holding pattern under the autonomous operation, periodically determine whether the secondary parking zone includes the at least one parking location or another at least one parking location; and
transmit another parking status of the vehicle.
| 16. The parking assistance control unit of claim 9, wherein the parking zone is defined by a graphic user interface of the vehicle.
| 17. A parking assistance device comprising:
one or more sensor devices disposable about a vehicle, the one or more sensor devices configured to monitor surroundings relative to the vehicle;
a wireless communication interface operable to service communications; and
a computing device coupled to the wireless communication interface and in communication with the one or more sensor devices, the computing device including:
one or more processors, the one or more processors for controlling operations of the parking assistance device;
a memory coupled to the one or more processors, the memory for storing data and program instructions used by the one or more processors, wherein the one or more processors are configured to execute instructions stored in the memory to:
receive a parking zone based on a destination location and a user-defined parking parameter, the parking zone including a plurality of parking locations;
when on approach to the parking zone under the autonomous operation, determine whether the parking zone includes at least one parking location that is physically available for parking the vehicle via sensor data of the one or more sensor devices; and
when the parking zone does not include the at least one parking location that is physically available for parking the vehicle:
prompt the vehicle to engage in a holding pattern for a predetermined period of time, wherein the holding pattern includes at least one of a dynamic holding pattern indicative of placing the vehicle in motion and a stationary holding pattern indicative of placing the vehicle in a stopped state;
while in the holding pattern under the autonomous operation, periodically determining whether the at least one parking location becomes available; and
when the predetermined period of time lapses, transmit via the wireless communication interface, a parking status of the vehicle based on a result of the periodically determining whether the at least one parking location becomes available.
| 18. The parking assistance device of claim 17, wherein the holding pattern comprises:
a combinational stationary and dynamic holding pattern.
| 19. The parking assistance device of claim 17, wherein the holding pattern is based on traffic trends related to the parking zone.
| 20. The parking assistance device of claim 19, wherein the traffic trends include at least one of:
a traffic density assessment; and
a local venue activity assessment. | The method involves receiving a parking zone based on a destination location and a user-defined parking parameter, where the parking zone is provided with multiple parking locations. The parking zone provided at a parking location from multiple parking locations physically available for parking a vehicle (100) is determined while providing an approach to the parking zone under an autonomous operation. The vehicle is prompted to engage in a holding pattern for a predetermined period of time when the parking zone is not provided with the parking location. INDEPENDENT CLAIMS are also included for the following:a parking assistance control unit with a wireless communication interface; anda parking assistance device with a computing device. Method for a parking assistance control unit (Claimed) for a vehicle for an autonomous operation. The parking zone is received based on a destination location and a user-defined parking parameter, where the parking zone is provided with multiple parking locations, and hence ensures improving the positional accuracy with the autonomous parking features and permits the flexibility to a vehicle user in an event in an easy manner. The drawing shows a schematic view of a vehicle with a parking assistance control unit. 100Vehicle102,104Sensor devices106a,106bVideo sensor devices200Parking assistance control unit220Antenna |
Please summarize the input | SYSTEMS AND METHODS FOR DYNAMIC ROAD SIGN PERSONALIZATION
Systems and methods are provided for presenting personalized information to one or more vehicle occupants as a vehicle approaches and/or passes a road sign. Characteristics of a vehicle, e.g., operating conditions, characteristics of the one or more vehicle occupants, e.g., age, demographic, driving record, purchase history, etc. and/or road conditions, e.g., weather-related road conditions, current traffic conditions, etc., may be obtained. Information relevant to the one or more vehicle occupants based on one or more of these characteristics/conditions may be presented to the one or more vehicle occupants. In scenarios where the presentation of personalized information is unwanted/unwarranted, more generalized information and/or safety warnings or recommendations can be presented.
What is claimed is:
| 1. A method comprising:
detecting proximity of a vehicle to a road sign;
obtaining data regarding at least one of operating characteristics of the vehicle and characteristics of one or more occupants of the vehicle;
determining current traffic conditions proximate to the road sign;
upon a determination that personalized information should not be presented via the road sign based on the current traffic conditions, presenting generalized information on the road sign; and
upon a determination that personalized information should be presented via the road sign, presenting the personalized information on the road sign.
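A small sketch of the claim 1 decision: generalized information is shown whenever current traffic conditions make a personalized presentation inappropriate. The traffic-condition fields, the single-viewer rule, and the sample messages are invented for illustration only.

def choose_sign_content(vehicle_data, occupant_data, traffic):
    """Return (kind, payload) for the road sign presentation.

    traffic: dict with assumed keys 'vehicles_in_view' and 'flow_speed_mph'.
    Personalized content is withheld when other, non-intended vehicles would
    also see it, or when traffic is moving too fast to read it.
    """
    too_many_viewers = traffic["vehicles_in_view"] > 1
    too_fast = traffic["flow_speed_mph"] > 65
    if too_many_viewers or too_fast:
        return "generalized", {"message": "Reduce speed ahead"}  # generic safety info
    return "personalized", {
        "message": f"Rest stop with {occupant_data['preferred_coffee']} in 2 miles",
        "vehicle_id": vehicle_data["id"],
    }

kind, payload = choose_sign_content(
    {"id": "veh-42", "speed_mph": 58},
    {"preferred_coffee": "espresso"},
    {"vehicles_in_view": 1, "flow_speed_mph": 58},
)
print(kind)  # "personalized"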
| 2. The method of claim 1, wherein obtaining data regarding the operating characteristics of the vehicle comprise communicating with the at least one of an electronic control unit and one or more sensors of the vehicle over a vehicle-to-infrastructure communications channel to obtain the operating characteristics of the vehicle.
| 3. The method of claim 1, wherein the operating characteristics of the vehicle comprise at least one of vehicle speed, vehicle acceleration, travel origin, travel destination, lane change history, and continuous operation time.
| 4. The method of claim 1, wherein obtaining data regarding characteristics of one or more occupants of the vehicle comprises receiving at least one of biometric sensor signals from biometric sensors associated with the vehicle, obtaining vehicle associated account information regarding the one or more occupants of the vehicle, and obtaining information suggestive of at least one of the one or more occupants' travel preferences, travel history, purchase preferences, purchase history, and demographic information associated with the one or more occupants of the vehicle.
| 5. The method of claim 1, wherein the current traffic conditions upon which the determination that personalized information should not be presented comprises traffic conditions that result in non-intended recipients of the personalized information receiving the personalized information.
| 6. The method of claim 1, wherein the generalized information is based upon at least one of operating characteristics of a group of vehicles proximate to the road sign, common characteristics of one or more occupants of each vehicle in the group of vehicles, and road conditions data applicable to the group of vehicles.
| 7. The method of claim 1, further comprising, prior to presenting the personalized information on the road sign, determining to present the personalized information on another road sign, based upon a change in at least one of visibility of the road sign, speed of travel of the vehicle, direction of travel of the vehicle, and lane change.
| 8. A method comprising:
detecting proximity of a vehicle to a digital billboard;
obtaining operating characteristics of the vehicle;
obtaining characteristics of at least one occupant of the vehicle;
determining at least a current speed of travel of the vehicle;
estimating whether first, targeted media for presentation on the digital billboard, selected based upon the operating characteristics of the vehicle and the characteristics of the at least one occupant of the vehicle, could be generated and viewed by the at least one occupant based upon the current speed of travel of the vehicle; and
if the first, targeted media for presentation on the digital billboard cannot be generated for and viewed by the at least one occupant, presenting second, generalized media on the digital billboard based upon operational characteristics of the vehicle.
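A sketch of the estimating step above: given the current speed and the distance to the billboard, can targeted media be generated and still be viewed before the vehicle passes? The generation latency and minimum viewing time are assumed figures.

def can_show_targeted_media(speed_mps, distance_to_billboard_m,
                            generation_latency_s=2.0, min_viewing_time_s=4.0):
    """Return True if targeted media could be generated and still be viewable."""
    if speed_mps <= 0:
        return True  # stopped or queued traffic: plenty of time
    time_to_pass_s = distance_to_billboard_m / speed_mps
    return time_to_pass_s - generation_latency_s >= min_viewing_time_s

# 30 m/s (~67 mph) and 120 m away: 4 s to pass, not enough after generation.
print(can_show_targeted_media(30.0, 120.0))  # False -> present generalized media
print(can_show_targeted_media(15.0, 120.0))  # True  -> present targeted media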
| 9. The method of claim 8, wherein the at least one occupant is a driver of the vehicle.
| 10. The method of claim 9, wherein the second, generalized media comprises at least one of a safety warning and a safe driving recommendation based upon the characteristics of the driver.
| 11. The method of claim 8, further comprising:
detecting proximity of at least one other vehicle to the digital billboard;
obtaining operating characteristics of the least one other vehicle;
obtaining characteristics of at least one additional occupant of the at least one other vehicle;
determining at least a current speed of travel of the at least one other vehicle;
estimating whether the first, targeted media for presentation on the digital billboard could be generated and viewed by the at least one occupant of each of the vehicles proximate to the digital billboard based upon the current speeds of travel of each of the vehicles; and
if the first, targeted media for presentation on the digital billboard cannot be generated for and viewed by the at least one occupant of each of the vehicles proximate to the digital billboard, presenting second, generalized media on the digital billboard based upon aggregate operational characteristics of each of the vehicles proximate to the digital billboard.
| 12. The method of claim 8, further comprising determining whether the vehicle has declined targeted media presentation, and upon a determination that the vehicle has declined targeted media presentation, controlling the digital billboard to generate at least one of a blank and non-informational presentation.
| 13. The method of claim 8, further comprising determining whether the obtained operating characteristics of the vehicle comprises a vehicle setting opting out of targeted media for presentation on the digital billboard.
| 14. A system, comprising:
at least one processor; and
at least one memory unit operatively connected to the processor, the at least one memory unit having stored thereon, at least one computer program comprising computer code causing the at least one processor to perform the following:
obtain operating characteristics of a vehicle detected as being proximate to a digital billboard;
obtain characteristics of at least one occupant of the vehicle;
determine at least a current speed of travel of the vehicle;
transmit the obtained operating characteristics of the vehicle, the characteristics of the at least one occupant, and the current speed of travel of the vehicle to a controller controlling operation of the digital billboard, the controller adapted to estimate whether first, targeted media for presentation on the digital billboard, selected based upon the characteristics of the at least one occupant and the operating characteristics of the vehicle, could be generated and viewed by the at least one occupant based upon the current speed of travel of the vehicle; and if the first, targeted media for presentation on the digital billboard cannot be generated for and viewed by the at least one occupant, controlling the digital billboard to present second, generalized media on the digital billboard based upon operational characteristics of the vehicle.
| 15. The system of claim 14, wherein at least one of the operating characteristics of the vehicle and the characteristics of the at least one occupant of the vehicle are obtained by a vehicle-to-infrastructure (V2I) capable roadside unit.
| 16. The system of claim 15, wherein the V2I capable roadside unit obtains the at least one of the operating characteristics of the vehicle and the characteristics of the at least one occupant of the vehicle from at least one of an electronic control unit of the vehicle, one or more sensors implemented in or associated with the vehicle, and a database in which at least one aspect of the characteristics of the at least one occupant are stored.
| 17. The system of claim 14, wherein the controller controlling operation of the digital billboard is at least one of remotely located from the digital billboard and co-located with the digital billboard.
| 18. The system of claim 14, wherein the vehicle comprises an autonomous vehicle, and wherein the controller controlling operation of the digital billboard initiates transmission of a speed override command to the autonomous vehicle to reduce its speed to a speed allowing the targeted media to be viewed by the at least one occupant. | The method involves detecting proximity of a vehicle (110) to a road sign (104). Data regarding one of operating characteristics of the vehicle and characteristics of occupants of the vehicle is obtained. Current traffic conditions proximate to the road sign are determined. Generalized information is presented on the road sign upon a determination that personalized information is not be presented through the road sign based on the current traffic conditions. The personalized information is presented on the road sign upon a determination that personalized information is presented through the road sign. An INDEPENDENT CLAIM is also included for a system for obtaining operating characteristics of vehicles and/or occupants proximate to a road sign. Method for obtaining operating characteristics of vehicles and/or occupants proximate to a road sign i.e. digital billboard. Uses include but are not limited to operating conditions such as weather-related road conditions and current traffic conditions and/or age, demographic, driving record and purchase history. The method enables providing redundancy and/or multiple sources of information that is compared or used as a way to verify a validity of received information, thus increasing accuracy of information. The drawing shows a schematic view of a driving scenario. 100Roadway102Roadside unit104Road sign106Roadside unit110,114Vehicles |
Please summarize the input | RESOLVING VEHICLE APPLICATION VERSION DIFFERENCES
The disclosure includes embodiments for resolving vehicle application version differences for connected vehicles. A method includes determining a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying version differences in a common vehicle application installed in both the ego vehicle and the remote connected vehicle. The method includes forming a vehicular micro cloud. The method includes determining a maximum possible functionality of the common vehicle application. The method includes determining a set of tasks to be completed to achieve the maximum possible functionality. The method includes assigning a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle. The method includes using the vehicular micro cloud to cause the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset.
What is claimed is:
| 1. A method executed by a processor of an ego vehicle, the method comprising:
determining a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying a difference in a first version of the common vehicle application installed in the remote connected vehicle and a second version of the common vehicle application installed in the ego vehicle;
forming a vehicular micro cloud including the ego vehicle and the remote connected vehicle responsive to the difference being identified;
determining, by a processor of the ego vehicle, a maximum possible functionality of the common vehicle application;
determining a set of tasks to be completed to achieve the maximum possible functionality;
assigning a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle, wherein collectively the first subset and the second subset include each of the tasks in the set of tasks; and
using the vehicular micro cloud to cause the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset so that the maximum possible functionality of the common application is achieved.
| 2. The method of claim 1, wherein the common vehicle application includes a vehicle control system.
| 3. The method of claim 1, wherein the common vehicle application includes an autonomous driving system.
| 4. The method of claim 1, wherein the common vehicle application includes an Advanced Driver Assistance System.
| 5. The method of claim 1, wherein the maximum possible functionality includes the functionality of a most recently released version of the common vehicle application.
| 6. The method of claim 1, wherein the maximum possible functionality includes less functionality than a most recently released version of the common vehicle application.
| 7. The method of claim 1, wherein the vehicular micro cloud provides functionality that benefits each member of the vehicular micro cloud.
| 8. The method of claim 1, further comprising parsing V2X data from a wireless message received from the ego vehicle, wherein the V2X data includes remote application data that describes a first set of applications installed in the remote connected vehicle and first version information for the first set of applications, wherein ego application data describes a second set of applications installed in the ego vehicle and second version information for the second set of applications.
| 9. The method of claim 8, wherein the common vehicle application is determined based on comparison of the remote application data and the ego application data.
| 10. The method of claim 8, wherein the V2X data includes member data describing a first hardware capability of the remote vehicle.
| 11. The method of claim 10, wherein the maximum possible functionality of the common vehicle application is determined by the processor of the ego vehicle based on one or more of: the first hardware capability of the remote vehicle; a second hardware capability of the ego vehicle; the first version information; and the second version information.
| 12. The method of claim 1, wherein the vehicular micro cloud includes each member of the vehicular micro cloud sharing their unused computing resources with one another to complete the set of tasks for a benefit of each member of the vehicular micro cloud.
| 13. The method of claim 12, wherein the benefit includes reducing a risk of a collision by resolving vehicle application version differences among members of the vehicular micro cloud.
| 14. A system of an ego vehicle comprising:
a communication unit;
a non-transitory memory;
and a processor communicatively coupled to the communication unit and the non-transitory memory, wherein the non-transitory memory stores computer readable code that is operable, when executed by the processor, to cause the processor to execute steps including:
determining a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying a difference in a first version of the common vehicle application installed in the remote connected vehicle and a second version of the common vehicle application installed in the ego vehicle;
forming a vehicular micro cloud including the ego vehicle and the remote connected vehicle responsive to the difference being identified;
determining, by a processor of the ego vehicle, a maximum possible functionality of the common vehicle application;
determining a set of tasks to be completed to achieve the maximum possible functionality;
assigning a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle, wherein collectively the first subset and the second subset include each of the tasks in the set of tasks; and
causing the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset so that the maximum possible functionality of the common application is achieved.
| 15. The system of claim 14, further comprising parsing V2X data from a wireless message received from the ego vehicle, wherein the V2X data includes remote application data that describes a first set of applications installed in the remote connected vehicle and first version information for the first set of applications, wherein ego application data describes a second set of applications installed in the ego vehicle and second version information for the second set of applications.
| 16. The system of claim 15, wherein the common vehicle application is determined based on comparison of the remote application data and the ego application data.
| 17. The system of claim 15, wherein the V2X data includes member data describing a first hardware capability of the remote vehicle.
| 18. The system of claim 17, wherein the maximum possible functionality of the common vehicle application is determined by the processor of the ego vehicle based on one or more of: the first hardware capability of the remote vehicle; a second hardware capability of the ego vehicle; the first version information; and the second version information.
| 19. A computer program product of an ego vehicle including computer code stored on a non-transitory memory that is operable, when executed by an onboard vehicle computer of the ego vehicle, to cause the onboard vehicle computer to execute operations including:
determine a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying a difference in a first version of the common vehicle application installed in the remote connected vehicle and a second version of the common vehicle application installed in the ego vehicle;
form a vehicular micro cloud including the ego vehicle and the remote connected vehicle responsive to the difference being identified;
determine a maximum possible functionality of the common vehicle application;
determine a set of tasks to be completed to achieve the maximum possible functionality;
assign a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle, wherein collectively the first subset and the second subset include each of the tasks in the set of tasks; and
cause the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset so that the maximum possible functionality of the common application is achieved.
| 20. The computer program product of claim 19, wherein the non-transitory memory stores additional computer code that is operable, when executed by the onboard vehicle computer, to cause the onboard vehicle computer to execute additional operations including:
parse V2X data from a wireless message received from the ego vehicle, wherein the V2X data includes remote application data that describes a first set of applications installed in the remote connected vehicle and first version information for the first set of applications, wherein ego application data describes a second set of applications installed in the ego vehicle and second version information for the second set of applications; and
compare the remote application data and the ego application data to identify the common vehicle application based on the comparison. | The method involves determining a common vehicle application that is installed in both of a remote connected vehicle (124) and an ego vehicle. A vehicular micro cloud (194) including the ego vehicle and the remote vehicle is formed responsive to the difference being identified. A maximum possible functionality of the application is determined. A set of tasks to be completed is determined to achieve the functionality. Two subsets of the tasks are assigned to the vehicles, respectively. The micro cloud is used to cause the vehicles to complete the subsets, so that the functionality is achieved by a processor of the vehicle. INDEPENDENT CLAIMS are included for:(1) A system of an ego vehicle comprising a communication unit.(2) A computer program product of an ego vehicle including computer code stored on a non-transitory memory that is operable to cause the onboard vehicle computer to execute operations for resolving vehicle application version differences for connected vehicles. Method for resolving vehicle application version differences for connected vehicles. The product enables resolving vehicle application version differences for connected vehicles in an effective manner. The product allows the ego vehicle to complete the subset and the remote connected vehicle to perform the subset, so that the maximum possible functionality of the common application is achieved in an efficient manner. The drawing shows a schematic block diagram of an operating environment for a resolver system.103Server 124Remote connected vehicle 150Standard-compliant GPS unit 194Vehicular micro cloud 199Resolver system |
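The summary above describes pooling two installed versions of a common application and splitting the resulting task set between the micro-cloud members. The sketch below is a minimal, hypothetical illustration of that splitting step; the feature names and the helpers max_possible_functionality and assign_tasks are assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of the task-splitting step: pool the feature sets of the two
# installed versions, then divide the resulting tasks between the micro-cloud members.
def max_possible_functionality(ego_features: set, remote_features: set) -> set:
    """Features the micro cloud can deliver by pooling both members."""
    return ego_features | remote_features

def assign_tasks(tasks, ego_capable: set, remote_capable: set):
    ego_subset, remote_subset = [], []
    for task in tasks:
        # Prefer the vehicle whose installed version natively supports the task.
        if task in ego_capable:
            ego_subset.append(task)
        elif task in remote_capable:
            remote_subset.append(task)
    return ego_subset, remote_subset

ego = {"lane_keep", "object_detect"}
remote = {"object_detect", "trajectory_predict"}
tasks = sorted(max_possible_functionality(ego, remote))
print(assign_tasks(tasks, ego, remote))
# (['lane_keep', 'object_detect'], ['trajectory_predict'])
```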
Please summarize the input | SYSTEMS AND METHODS FOR VEHICULAR-NETWORK-ASSISTED FEDERATED MACHINE LEARNINGSystems and methods for vehicular-network-assisted federated machine learning are disclosed herein. One embodiment transmits first metadata from a connected vehicle to at least one other connected vehicle; receives, at the connected vehicle, second metadata from the at least one other connected vehicle; receives, at the connected vehicle based on analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in the current training phase of a federated machine learning process; receives, at the connected vehicle, instructions to prepare the connected vehicle for the next training phase; trains a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and submits the locally trained machine learning model for aggregation with at least one other locally trained machine learning model to produce an aggregated locally trained machine learning model.What is claimed is:
| 1. A system for vehicular-network-assisted federated machine learning, the system comprising:
one or more processors; and
a memory communicably coupled to the one or more processors and storing:
a federated learning module including instructions that when executed by the one or more processors cause the one or more processors to:
transmit, after initiation of a federated machine learning process, first metadata from a connected vehicle to at least one other connected vehicle, wherein the connected vehicle and the at least one other connected vehicle are members of a vehicular micro cloud of networked vehicles and the first metadata includes information regarding sensor capabilities of the connected vehicle;
receive, at the connected vehicle, second metadata from the at least one other connected vehicle, the second metadata including information regarding sensor capabilities of the at least one other connected vehicle;
receive, at the connected vehicle based on an analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in a current training phase of the federated machine learning process; and
receive, at the connected vehicle, instructions to prepare the connected vehicle for a next training phase of the federated machine learning process;
a training module including instructions that when executed by the one or more processors cause the one or more processors to train a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and
an aggregation module including instructions that when executed by the one or more processors cause the one or more processors to submit the locally trained machine learning model for aggregation with at least one other locally trained machine learning model produced by at least one other elected vehicle in the vehicular micro cloud to produce an aggregated locally trained machine learning model.
| 2. The system of claim 1, wherein the first and second metadata include information regarding at least one of availability of particular types of sensors, capabilities of specific available sensors, quality of the specific available sensors, and an itinerary that includes scheduled stops.
| 3. The system of claim 1, wherein the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process include one or more of a speed advisory, a lane-change request, and a rerouting request.
| 4. The system of claim 1, wherein the connected vehicle receives the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process through coordination among a cloud leader of the vehicular micro cloud and at least one other cloud leader of a different vehicular micro cloud within a same vehicular macro cloud.
| 5. The system of claim 1, wherein the analysis of the first and second metadata is performed at one of a cloud server and an edge server, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.
| 6. The system of claim 1, wherein the analysis of the first and second metadata is performed at the connected vehicle and the at least one other connected vehicle in a distributed fashion, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.
| 7. The system of claim 1, wherein the connected vehicle and the at least one other connected vehicle communicate via one or more vehicle-to-vehicle (V2V) communication links.
| 8. The system of claim 1, wherein the aggregated locally trained machine learning model is further aggregated with at least one other aggregated locally trained machine learning model from another vehicular micro cloud at one of a cloud server and an edge server in preparation for the next training phase of the federated machine learning process.
| 9. The system of claim 1, wherein the connected vehicle is an autonomous vehicle.
| 10. A non-transitory computer-readable medium for vehicular-network-assisted federated machine learning and storing instructions that when executed by one or more processors cause the one or more processors to:
transmit, after initiation of a federated machine learning process, first metadata from a connected vehicle to at least one other connected vehicle, wherein the connected vehicle and the at least one other connected vehicle are members of a vehicular micro cloud of networked vehicles and the first metadata includes information regarding sensor capabilities of the connected vehicle;
receive, at the connected vehicle, second metadata from the at least one other connected vehicle, the second metadata including information regarding sensor capabilities of the at least one other connected vehicle;
receive, at the connected vehicle based on an analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in a current training phase of the federated machine learning process;
receive, at the connected vehicle, instructions to prepare the connected vehicle for a next training phase of the federated machine learning process;
train a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and
submit the locally trained machine learning model for aggregation with at least one other locally trained machine learning model produced by at least one other elected vehicle in the vehicular micro cloud to produce an aggregated locally trained machine learning model.
| 11. The non-transitory computer-readable medium of claim 10, wherein the first and second metadata include information regarding at least one of availability of particular types of sensors, capabilities of specific available sensors, quality of the specific available sensors, and an itinerary that includes scheduled stops.
| 12. The non-transitory computer-readable medium of claim 10, wherein the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process include one or more of a speed advisory, a lane-change request, and a rerouting request.
| 13. A method of vehicular-network-assisted federated machine learning, the method comprising:
transmitting, after initiation of a federated machine learning process, first metadata from a connected vehicle to at least one other connected vehicle, wherein the connected vehicle and the at least one other connected vehicle are members of a vehicular micro cloud of networked vehicles and the first metadata includes information regarding sensor capabilities of the connected vehicle;
receiving, at the connected vehicle, second metadata from the at least one other connected vehicle, the second metadata including information regarding sensor capabilities of the at least one other connected vehicle;
receiving, at the connected vehicle based on an analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in a current training phase of the federated machine learning process;
receiving, at the connected vehicle, instructions to prepare the connected vehicle for a next training phase of the federated machine learning process;
training a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and
submitting the locally trained machine learning model for aggregation with at least one other locally trained machine learning model produced by at least one other elected vehicle in the vehicular micro cloud to produce an aggregated locally trained machine learning model.
| 14. The method of claim 13, wherein the first and second metadata include information regarding at least one of availability of particular types of sensors, capabilities of specific available sensors, quality of the specific available sensors, and an itinerary that includes scheduled stops.
| 15. The method of claim 13, wherein the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process include one or more of a speed advisory, a lane-change request, and a rerouting request.
| 16. The method of claim 13, wherein the analysis of the first and second metadata is performed at one of a cloud server and an edge server, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.
| 17. The method of claim 13, wherein the analysis of the first and second metadata is performed at the connected vehicle and the at least one other connected vehicle in a distributed fashion, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.
| 18. The method of claim 13, wherein the connected vehicle and the at least one other connected vehicle communicate via one or more vehicle-to-vehicle (V2V) communication links.
| 19. The method of claim 13, wherein the task is one of object detection, object recognition, and trajectory prediction.
| 20. The method of claim 13, wherein the aggregated locally trained machine learning model is further aggregated with at least one other aggregated locally trained machine learning model from another vehicular micro cloud at one of a cloud server and an edge server in preparation for the next training phase of the federated machine learning process. | The system has a memory communicably coupled to processors (110) and storing a federated learning module (170) including instructions that when executed by the processors cause the processors to transmit metadata from a connected vehicle (180) i.e. autonomous vehicle, to another connected vehicle after initiation of federated machine learning process. The processors receive another metadata from the latter connected vehicle at the former connected vehicle, where the latter metadata includes information regarding sensor capabilities of the latter vehicle. An aggregation module submits a locally trained machine learning model for aggregation with another locally trained model produced by the latter elected vehicle in a vehicular micro cloud to produce an aggregated locally trained system learning model. INDEPENDENT CLAIMS are included for the following: Non-transitory computer-readable medium for vehicular-network-assisted federated machine learning and storing instructions Method of vehicular-network-assisted federated machine learning System for vehicular-network-assisted federated machine learning used in autonomous vehicles and intelligent driving-assistance systems for recognizing objects from red, green, blue (RGB) images or to predict trajectory of external road agents such as vehicles, cyclists, pedestrians, etc. The method enables training a machine-learning model to optimize parameters to maximize accuracy of predictions. The method allows a cloud computing environment to offer infrastructure, platforms and/or software as services for which a cloud consumer does not need to maintain resources on a local computing device. The drawing shows a block diagram of the connected vehicle.110Processors 170Federated learning module 180Connected vehicle 185Server 190Network |
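As a rough illustration of the aggregation step described above (locally trained models combined within a micro cloud, then aggregated again at an edge or cloud server), here is a minimal federated-averaging sketch. The plain-list weight representation and the function name federated_average are assumptions for illustration only.

```python
# Minimal federated-averaging sketch. Model weights are plain lists of floats here;
# in practice they would be tensors produced by each elected vehicle's local training.
def federated_average(local_models):
    """Element-wise mean of the locally trained model parameters."""
    n = len(local_models)
    return [sum(weights) / n for weights in zip(*local_models)]

# Two elected vehicles in one micro cloud submit their local models...
micro_cloud_model = federated_average([[0.25, 1.0, -0.5], [0.75, 0.5, -0.25]])
print(micro_cloud_model)  # [0.5, 0.75, -0.375]

# ...and the per-micro-cloud result can itself be averaged again at an edge or
# cloud server together with models contributed by other micro clouds.
global_model = federated_average([micro_cloud_model, [0.5, 0.75, -0.125]])
```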
Please summarize the input | V2V CHARGE MAPPINGA system for electrical charging of a first vehicle by a second vehicle includes a network access device to communicate with a first source that includes at least one of the first vehicle or a mobile device associated with a user of the first vehicle. The system further includes a processor coupled to the network access device that is designed to receive a charge request from the first source via the network access device, the charge request requesting access to a source of electrical energy for charging the first vehicle. The processor is further designed to identify an available vehicle that is available to be used as the source of electrical energy for charging the first vehicle. The processor is further designed to control the network access device to transmit available vehicle information corresponding to the available vehicle to the first source in response to receiving the charge request.What is claimed is:
| 1. A system for electrical charging of a first vehicle by a second vehicle, the system comprising:
a network access device configured to communicate with a first source that includes at least one of the first vehicle or a mobile device associated with a user of the first vehicle; and
a processor coupled to the network access device and configured to:
receive a charge request from the first source via the network access device, the charge request requesting access to a source of electrical energy for charging the first vehicle,
identify an available vehicle that is available to be used as the source of electrical energy for charging the first vehicle, and
control the network access device to transmit available vehicle information corresponding to the available vehicle to the first source in response to receiving the charge request.
| 2. The system of claim 1 wherein the available vehicle information includes a location of the available vehicle.
| 3. The system of claim 1 wherein the processor is further configured to:
determine digital key information corresponding to the available vehicle; and
transmit the digital key information to the first source to allow access to a charge port of the available vehicle.
| 4. The system of claim 1 wherein the processor is further configured to:
determine digital key information corresponding to the first vehicle; and
transmit digital key information to the available vehicle or to a second mobile device associated with a second user of the available vehicle to allow access to a charge port of the first vehicle.
| 5. The system of claim 1 wherein the processor is further configured to:
identify a plurality of available vehicles including the available vehicle;
control the network access device to transmit additional available vehicle data corresponding to each of the plurality of available vehicles; and
receive a selection of the available vehicle from the first source.
| 6. The system of claim 1 wherein the processor is further configured to:
identify a plurality of available vehicles including the available vehicle; and
select the available vehicle from the plurality of available vehicles based on at least one of:
a current location of the first vehicle and current locations of each of the plurality of available vehicles;
a state of charge (SOC) of each of the plurality of available vehicles; or
a current route of the first vehicle.
| 7. The system of claim 1 wherein:
the available vehicle is autonomous or semi-autonomous; and
the processor is further configured to control the network access device to at least one of navigation instructions from a current location of the available vehicle to a current location of the first vehicle, or provide the current location of the first vehicle to the available vehicle such that the second vehicle can autonomously travel to the current location of the first vehicle to charge the first vehicle.
| 8. The system of claim 1 wherein at least one of the first vehicle or the available vehicle is autonomous or semi-autonomous, and includes at least one of a wireless charging port configured to wirelessly transmit or receive the electrical energy, or an actuator configured to actuate a charging contact to make contact with a corresponding charging contact of the other of the first vehicle or the available vehicle.
| 9. The system of claim 1 further comprising a memory configured to store preferences for each of a plurality of available vehicles including the available vehicle, the preferences including at least one of a minimum state of charge (SOC) corresponding to a minimum SOC for each of the plurality of available vehicles, geographical area information corresponding to a geographical area which each of the plurality of available vehicles is available to travel to charge the first vehicle, time information corresponding to times at which each of the plurality of available vehicles is available to charge the first vehicle, or cost information corresponding an amount of payment that each of the plurality of available vehicles will pay for each unit of the electrical energy, wherein the processor is further configured to identify the available vehicle from the plurality of available vehicles based on the stored preferences.
| 10. The system of claim 1 further comprising a memory configured to store account information including at least one of payment information corresponding to the user of the first vehicle or loyalty points corresponding to the user of the first vehicle, wherein the processor is further configured to receive, via the network access device, quantity information corresponding to a quantity of the electrical energy provided from the available vehicle to the first vehicle, and to collect payment for the quantity of the electrical energy using the account information.
| 11. The system of claim 1 wherein the available vehicle is an autonomous vehicle and includes at least one of a fuel cell circuit configured to generate the electrical energy via a chemical reaction or a generator configured to convert fuel into the electrical energy such that the available vehicle is capable of providing a greater quantity of electrical energy than is stored in an available vehicle battery of the available vehicle.
| 12. A system for electrical charging of a first vehicle, the system comprising:
a battery having a state of charge (SOC) and configured to receive electrical energy;
a network access device configured to communicate with a remote server; and
an electronic control unit (ECU) coupled to the battery and the network access device and configured to transmit a charge request to the remote server requesting access to a source of electrical energy for charging the battery, and to receive available vehicle information corresponding to an available vehicle that is available to be used as the source of electrical energy for charging the battery.
| 13. The system of claim 12 wherein the available vehicle information includes a location of the available vehicle along with digital key information corresponding to the available vehicle and usable to provide access to a charge port of the available vehicle.
| 14. The system of claim 12 wherein the ECU is further configured to:
receive additional available vehicle data corresponding to each of a plurality of available vehicles via the network access device;
identify a preferred available vehicle based on at least one of:
a current location of the first vehicle and current locations of each of the plurality of available vehicles;
a state of charge (SOC) of each of the plurality of available vehicles; or
a current route of the first vehicle; and
control the network access device to transmit an identifier of the preferred available vehicle to the remote server.
| 15. The system of claim 12 wherein the available vehicle information includes at least one of a location of the available vehicle or navigation instructions from a current location of the first vehicle to the location of the available vehicle, wherein the ECU is further configured to autonomously control the first vehicle to travel to the location of the available vehicle based on the available vehicle information.
| 16. The system of claim 15 further comprising at least one of a wireless charge port configured to receive a wireless electrical energy signal, or a charging contact and a robot arm coupled to the charging contact and configured to actuate the charging contact to make contact with an external charging contact, wherein the ECU is further configured to control the at least one of the wireless charge port or the robot arm to be positioned in a way to receive the electrical energy from the available vehicle.
| 17. A method for electrical charging of a first vehicle by a second vehicle, the method comprising:
receiving, by a network access device and from a first source that includes at least one of the first vehicle or a mobile device associated with a user of the first vehicle, a charge request requesting access to a source of electrical energy for charging the first vehicle;
identifying, by a processor, an available vehicle that is available to be used as the source of electrical energy for charging the first vehicle; and
controlling, by the processor, the network access device to transmit available vehicle information corresponding to the available vehicle to the first source in response to receiving the charge request.
| 18. The method of claim 17 further comprising:
determining, by the processor, digital key information corresponding to the available vehicle; and
transmitting, by the processor, the digital key information to the first source to allow access to a charge port of the available vehicle.
| 19. The method of claim 17 further comprising:
identifying, by the processor, a plurality of available vehicles including the available vehicle;
controlling, by the processor, the network access device to transmit additional available vehicle data corresponding to each of the plurality of available vehicles; and
receiving, by the processor, a selection of the available vehicle from the first source.
| 20. The method of claim 17 further comprising:
identifying, by the processor, a plurality of available vehicles including the available vehicle; and
selecting, by the processor, the available vehicle from the plurality of available vehicles based on at least one of:
a current location of the first vehicle and current locations of each of the plurality of available vehicles;
a state of charge (SOC) of each of the plurality of available vehicles; or
a current route of the first vehicle. | The system has a network access device (8) for communicating with a source that includes a first vehicle (10) or a mobile device associated with a user of the first vehicle. A processor (4) is connected to the network access device to receive a charge request (18) from the source through the network access device, identify an available vehicle that is available to be used as the source of electrical energy, and control the network access device to transmit available vehicle information corresponding to the available vehicle to the source in response to receiving the charge request, where the charge request requests access to the source of electrical energy for charging the first vehicle. An INDEPENDENT CLAIM is included for a method for electrical charging of a first vehicle by a second vehicle. System for electrical charging of a first vehicle e.g. car, a bus, a motorcycle, a boat, an aircraft, by a second vehicle. The system increases availability of sources of electricity usable to charge electric vehicles in an effective manner. The system allows a user to select the available vehicle to be used as the source of electrical energy for charging the electric vehicle in an efficient manner. The drawing shows a block diagram of a system for electric charging of a first vehicle by a second vehicle.4Processor 6Memory 8Network access device 10First vehicle 18Charge request |
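The selection logic summarized above (pick an available donor vehicle by location, state of charge and stored preferences) could look roughly like the following sketch. The dictionary fields and the pick_available_vehicle helper are hypothetical and only illustrate one way the claimed criteria might be combined.

```python
# Hypothetical selection step: pick the closest candidate donor vehicle whose state
# of charge stays above its owner-configured minimum. Field names are invented.
import math

def pick_available_vehicle(ev_pos, candidates):
    """candidates: list of dicts with 'id', 'pos', 'soc' and 'min_soc' keys."""
    def distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    eligible = [c for c in candidates if c["soc"] > c["min_soc"]]
    if not eligible:
        return None   # no donor available; the request would be declined or queued
    return min(eligible, key=lambda c: distance(ev_pos, c["pos"]))

fleet = [
    {"id": "ACV-1", "pos": (2.0, 3.0), "soc": 0.80, "min_soc": 0.30},
    {"id": "ACV-2", "pos": (0.5, 0.5), "soc": 0.25, "min_soc": 0.30},  # below its minimum
]
print(pick_available_vehicle((0.0, 0.0), fleet)["id"])  # ACV-1
```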
Please summarize the input | A DEVICE FOR DETECTION AND PREVENTION OF AN ATTACK ON A VEHICLE A new device for the detection and prevention of an attack on a vehicle, via the vehicle's communication channels, includes an input-unit configured to collect real-time and/or offline data from various sources, such as sensors, network-based services, navigation applications, electronic control units of vehicles, bus-networks of the vehicle, sub-systems of the vehicle, and on board diagnostics (OBD); a database for storing data; a detection-unit in communication with the input-unit; and an action-unit in communication with the detection unit, configured to send an alert over the communication channels and/or prevent an attack by stopping or changing the attacked communication channels. The detection-unit is configured to simultaneously monitor the content of the data, the meta-data, and the physical-data, and detect an attack.|1. A device for executing instructions for in-vehicle attack detection and prevention, the device comprising at least one hardware processor, the at least one hardware processor comprising:
code instructions for collecting real-time data from one or more data sources of the vehicle;
code instructions for analyzing the real-time data to detect at least two irregularities; and code instructions for performing at least one of sending an alert when the at least two irregularities are detected and preventing at least one attack, wherein the at least two irregularities include:
irregularities between data received from a vehicle sensor and operation maintained by one of a plurality of subsystems of the vehicle;
irregularity in the relationship between one operation maintained by one of the plurality of subsystems and another operation maintained by another one of the plurality of subsystems; and an irregular cellular provider signal or irregular alteration of a cellular provider signal of the source of the real-time data;
wherein the code instructions are stored on a non-transitory computer readable medium.
| 2. The device of claim 1 , further comprising a database comprising parameters of at least one known attack previously detected as the attack, wherein the detection is performed according to the parameters.
| 3. The device of claim 2, wherein the device further comprises an interface for communicating with a remote server to perform at least one of forwarding parameters of the known attack and receiving parameters of the known attack.
| 4. The device of claim 1 , wherein the at least one hardware processor is at least partially embedded within a hardware card of the vehicle.
| 5. The device of claim 2, wherein the at least one hardware processor executes a machine learning engine configured to calculate an unfamiliar behavior of the vehicle based on analysis of the real-time data, wherein the parameters are updated or calculated according to the unfamiliar behavior.
| 6. The device of claim 1, wherein one of the plurality of subsystems is selected from the group consisting of:
i. tire pressure monitoring;
ii. stability control;
iii. cruise control;
iv. airbag control;
v. PCM (Powertrain Control Module);
vi. Transmission Control Module (TCM);
vii. Brake Control Module (BCM);
viii. Central Control Module (CCM);
ix. Central Timing Module (CTM);
x. General Electronic Module (GEM);
xi. Body Control Module (BCM);
xii. Suspension Control Module (SCM);
xiii. Convenience Control Unit (CCU);
xiv. ECU (Engine Control Unit);
xv. Electrical Power Steering Control Unit (PSCU);
xvi. Human Machine Interface (HMI);
xvii. seat control unit;
xviii. speed control unit;
xix. Telephone Control Unit (TCU);
xx. Transmission Control Unit (TCU);
xxi. brake control module (ABS or ESC);
xxii. crash sensors;
xxiii. airbags;
xxiv. seat belts;
xxv. Tire Pressure Monitoring System (TPMS);
xxvi. Electronic Stability Control (ESC) systems;
xxvii. TCS (Traction Control System);
xxviii. anti lock braking system (ABS);
xx. Electronic Brake Assistance (EBA) systems;
xxx. Electronic braking force distribution unit;
xxxi. electronic brake force distribution (EBD) systems;
xxxii. emergency stop;
xxxiii. driven notifications and alerts;
xxxiv. Pedestrian object recognition;
xxxv. Lane keeping assistance;
xxxvi. Collision avoidance;
xxxvii. Adaptive headlamps control;
xxxviii. Reverse backup sensors;
xxxix. adaptive cruise control;
xl. Active Cruise Control (ACC);
xli. traction control systems;
xlii. electronic stability control;
xliii. Automated parking system;
xliv. multimedia;
xlv. Active noise cancellation (ANC);
xlvi. radio;
xlvii. Radio Data System (RDS);
xlviii. driver information functions;
xlix. AM/FM or satellite radio;
l. DC/DVD player;
li. payment systems;
li. in-vehicle Wi-Fi router;
liii. interior lights;
liv. climate control;
lv. Chairs adjustment;
lvi. electric windows;
lvii. mirror adjustment unit;
lviii. central lock;
lix. battery management unit;
lx. charge management unit;
lxi. vehicle-grid systems;
lxii. active cruise control (ACC);
lxiii. remote control keys;
lxiv. Theft deterrent systems;
lxv. Immobilizer system;
lxvi. security systems;
lxvii. digital cameras;
lxviii. night vision;
lxix. lasers;
lxx. Radar;
lxxi. RF sensors;
lxxii. infotainment system;
lxxiii. robotic gear-shaft; and
lxxiv. any combination thereof.
| 7. The device of claim 1, wherein the vehicle sensor is selected from the group consisting of:
a. distance sensor;
b. speed sensor;
c. temperature sensor;
d. satellite transmission sensor;
e. cellular transmission sensors;
f. video footage;
g. air-fuel ratio meters;
h. blind spot monitor;
i. crankshaft position sensor;
j. curb sensors used to warn the driver of curbs;
k. anomaly detectors used to detect axle and signal problems of passing trains on railroads;
l. an engine coolant temperature (ECT) sensor used to measure engine temperature;
m. a Hall effect sensor used to time the speed of the wheels and shafts;
n. a manifold absolute pressure (MAP) sensor used to regulate fuel metering;
o. a mass flow sensor or mass airflow (MAF) sensor used to indicate the mass of air entering the engine;
p. an oxygen sensor used to monitor the amount of oxygen in the exhaust pipe of the vehicle;
q. parking sensors used to warn the driver of invisible obstacles during parking maneuvers;
r. a speed meter used to detect the speed of other objects;
s. speedometers used to measure the instantaneous speed of land vehicles;
t. a speed sensor used to detect the speed of an object;
u. a throttle position sensor used to monitor the position of a throttle in an internal combustion engine;
v. a tire pressure monitoring sensor used to monitor the air pressure inside the tires;
w. a torque sensor or torque transducer or torque meter that measures torque (torsion force) on a rotating system;
x. a transmission fluid temperature sensor used to measure the temperature of the transmission fluid;
y. a turbine speed sensor (TSS) or input speed sensor (ISS) used to measure the rotational speed of an input shaft or torque converter;
z. a variable reluctance sensor used to measure the position and velocity of moving metal components;
aa. a vehicle speed sensor (VSS) used to measure the speed of the vehicle;
bb. a moisture sensor or a moisture sensor in the fuel used to indicate the presence of moisture in the fuel;
cc. a wheel speed sensor used to read the wheel rotation speed of the vehicle;
dd. comfort sensors including seat position, seat row, air condition and occupant position; and
ee. any combination thereof.
| 8. The device of claim 1, wherein at least one of an irregular source and an irregular destination of the real-time data is an electronic control unit (ECU) selected from the group consisting of:
a. electronic/engine control module (ECM);
b. powertrain control module (PCM);
c. transmission control module (TCM);
d. brake control module (BCM);
e. Central Control Module (CCM);
f. Central Timing Module (CTM);
g. Generic Electronic Module (GEM);
h. body control module (BCM);
i. Suspension Control Module (SCM);
j. airbag control unit (ACU);
k. a body control module (BCU) for controlling door locks, automatic windows and interior lights;
l. Convenience Control Unit (CCU);
m. door control unit (DCU);
n. engine control unit;
o. an electric power steering control unit (PSCU) integrated into the electric power steering (EPS) power box;
p. human machine interface (HMI);
q. powertrain control module (PCM);
r. seat control unit;
s. speed control unit (SCU);
t. telephone control unit (TCU);
u. telematic control unit (TCU);
v. transmission control unit (TCU);
w. brake control module (BCM);
x. onboard or integrated ECU handling remote services; and
y. any combination thereof.
| 9. The device of claim 1, wherein the at least one hardware processor executes at least one of a commercially available antivirus, a malware application, a firewall or other malware database, an at least partially autonomous driving system, a remote control system, or a fully autonomous driving system.
| 10. The device of claim 1 , wherein the vehicle is driven by a robotic platform.
| 11. The device of claim 1 , wherein the vehicle moves over land, water, or air.
| 12. The device of claim 1 , further comprising an evaluation engine configured to prioritize the attack by evaluating a risk level of the attack on the vehicle and its occupants.
| 13. The device of claim 1, wherein the code instructions for performing at least one of sending the alert and preventing at least one attack include code instructions for performing at least one of changing and blocking one or more communication channels connected to one or more network-based services, the one or more network-based services being selected from the group consisting of web, physical cable, Wi-Fi, cellular, Bluetooth, RF, GPS, vehicle-to-vehicle communication, vehicle-to-passenger infrastructure, and environment-to-vehicle infrastructure.
| 14. The device of claim 1 , wherein at least one of the irregular source and the irregular destination of the real-time data comprises one or more navigation applications or devices selected from the group consisting of a satellite navigator, a cellular navigator, and an inertial only navigator.
| 15. The device of claim 1, wherein the one or more data sources are selected from the group consisting of:
one or more sensors; one or more network-based services; one or more navigation applications or navigation devices; one or more electronic control units (ECUs) of the vehicle; one or more bus networks of the vehicle; one or more subsystems of the vehicle; and one or more onboard diagnostics (OBD).
| 16. A computer-implemented method by at least one hardware processor for in-vehicle attack detection and prevention, the method comprising:
collecting real-time data from one or more data sources of the vehicle;
analyzing, by the at least one hardware processor, the real-time data to detect at least two irregularities; and performing at least one of sending an alert and preventing at least one attack when the at least two irregularities are detected, wherein the at least two irregularities are selected from the group consisting of:
irregularities between current vehicle operation and operation maintained by one of a plurality of subsystems of the vehicle;
irregularity in the relationship between one operation maintained by one of the plurality of subsystems and another operation maintained by another one of the plurality of subsystems; irregularity of the relationship between the current output of at least one sensor of the vehicle and the real-time data; and an irregular cellular provider signal or irregular change in a cellular provider signal of a source of instructions found in the real-time data;
selected from the group consisting of
| 17. A computer readable medium recording a computer program for in-vehicle attack detection and prevention, the computer program comprising program instructions, the program instructions being executable by a hardware processor of the vehicle to:
collect real-time data from one or more data sources of the vehicle;
analyze the real-time data to detect at least two irregularities; and performing at least one of sending an alert and preventing at least one attack when the at least two irregularities are detected;
The at least two irregularities are:
irregularities between current vehicle operation and operation maintained by one of a plurality of subsystems of the vehicle;
irregularity in the relationship between one operation maintained by one of the plurality of subsystems and another operation maintained by another one of the plurality of subsystems; irregularity of the relationship between the current output of at least one sensor of the vehicle and the real-time data; and an irregular cellular provider signal or irregular change in a cellular provider signal of a source of instructions found in the real-time data;
A computer-readable medium selected from the group consisting of | The device has an input-unit (110) collecting real-time and/or offline data from a source. A database (120) stores the data. A detection-unit (130) is provided in communication with the input-unit. An action-unit (140) is provided in communication with the detection unit, and sends an alert through communication channels and/or prevents an attack on a vehicle by breaking or changing the attack through the communication channels. The detection-unit simultaneously monitors content, meta-data and physical-data of the data, and detects the attack. The source is selected from a group consisting of a vehicle's sensor (111), a network based service (112), a navigation application (113) or navigation device, a vehicle electronic control unit (ECU) (114), a vehicle's bus-network (115), a vehicle's subsystem (116), and a vehicle's on board diagnostic (OBD) (117). The detection-unit detects the attack based on a characteristic that is selected from a group consisting of the data content, irregular source of the data, irregular destination for the data, irrational data content when compared with data received by the source, irrational action of the subsystem when compared with data received by another subsystem, irrational action between the subsystems, irrational action of one of the subsystems when compared with the data received by one of sensors, irrational meta-data, irrational meta-content, jam or blockage of communication channels and/or the network-based services, and sudden change in signal features of the network-based services and/or sensors. The sensor is a distance sensor, velocity sensor, temperature sensor, satellite transmission sensor, cellular transmission sensor, video image, air-fuel ratio meter, blind spot monitor, crankshaft position sensor, curb feeler, defect detector used on railroads to detect axle and signal problems in passing trains, engine coolant temperature (ECT) sensor, hall effect sensor, manifold absolute pressure (MAP) sensor, mass flow sensor or mass airflow (MAF) sensor, oxygen sensor, parking sensors, radar gun, speedometer, speed sensor, throttle position sensor, tire-pressure monitoring sensor, torque sensor or torque transducer or torque-meter, transmission fluid temperature sensor, turbine speed sensor (TSS), or input speed sensor (ISS). The subsystem is selected from a group consisting of tire pressure monitoring, stability control, cruise control, airbag control, powertrain control module (PCM), transmission control module (TCM), brake control module (BCM), central control module (CCM), central timing module (CTM), and general electronic module (GEM). The network based services are selected from a group consisting of web, physical cable, Wi-Fi , cellular, bluetooth , RF, GPS, vehicle to vehicle communication, vehicle to passenger infrastructure, environment to vehicle infrastructure. The navigation applications or devices are selected from a group consisting of satellite navigator, cellular navigator and inertial dedicated navigator. An INDEPENDENT CLAIM is also included for a method for detecting and preventing an attack on a vehicle through communication channels. Device for detection and prevention of an attack e.g. cyber attack and communication attack, on a vehicle i.e. robotic platform (claimed), through communication channels. 
The device utilizes a cellular device for virus detection followed by a sequence of irrational commands to the vehicle's sub-systems and identification of irrational communication characteristics followed by a sharp turn of the steering wheel while the navigation system recognizes an untracked or dangerous path, or even a collision. The drawing shows a block diagram of a device for detection and prevention of an attack on a vehicle and basic components. 110Input-unit 111Vehicle's sensor 112Network based service 113Navigation application 114Vehicle ECU 115Vehicle's bus-network 116Vehicle's subsystem 117Vehicle's OBD 120Database 130Detection-unit 140Action-unit
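To make the "at least two irregularities" rule from claim 1 more concrete, the sketch below shows one hypothetical way such checks might be counted and acted on. The specific signal names and thresholds are invented for illustration and are not part of the patent.

```python
# Rough sketch of the "at least two irregularities" rule from claim 1. The individual
# checks are stand-ins; a real system would compare live bus traffic, sensor output
# and cellular-signal metadata rather than a single dictionary of flags.
def count_irregularities(frame: dict) -> int:
    checks = [
        # a sensor reading disagrees with the action a subsystem is maintaining
        frame["wheel_speed_kph"] > 5 and frame["parking_brake_engaged"],
        # two subsystems issue mutually inconsistent operations
        frame["throttle_open"] and frame["abs_emergency_braking"],
        # the cellular provider signal changed in an unexpected way
        frame["cell_provider"] != frame["last_known_provider"],
    ]
    return sum(bool(c) for c in checks)

def handle(frame: dict) -> str:
    if count_irregularities(frame) >= 2:
        return "ALERT: possible attack - blocking the affected channel"
    return "ok"

print(handle({"wheel_speed_kph": 42, "parking_brake_engaged": True,
              "throttle_open": True, "abs_emergency_braking": True,
              "cell_provider": "A", "last_known_provider": "A"}))  # ALERT
```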
Please summarize the input | AUTONOMOUS VEHICLE FOR TEMPORARILY POWERING ELECTRIC VEHICLES (EVS) ON THE ROADMethods and systems for charging an electric vehicle (EV) are described herein. An EV may require additional battery power to reach a charging station. A remote server in communication with the EV or an on-board computer or mobile device in the EV may obtain data to determine a location for the EV to meet a charging vehicle. The charging vehicle may be dispatched to meet the EV and deliver power to it, enabling the EV to reach a charging station or other destination. In some examples, the charging vehicle may deliver power to the EV while both vehicles are stationary. In other examples, the charging vehicle may couple to the EV while both vehicles are in motion.What is claimed is:
| 1. A computer-implemented method for charging an electric vehicle (EV) while the EV is moving, comprising:
obtaining, by one or more processors, a request to deliver electrical power to the EV;
identifying, by the one or more processors, at least one autonomous charging vehicle (ACV) to charge the EV, wherein the ACV is configured to charge the EV while the EV and the ACV are each moving;
obtaining, by the one or more processors, EV location data and ACV location data;
determining, by the one or more processors, a coupling location based upon the EV location data and the ACV location data;
dispatching, by the one or more processors, the ACV to the coupling location; and
causing, by the one or more processors, the ACV to deliver the electrical power to the EV for a threshold charging period, wherein the ACV and the EV are each moving during at least a portion of the threshold charging period.
| 2. The computer-implemented method of claim 1, further comprising:
obtaining, by the one or more processors, battery data of the EV, wherein the battery data includes at least a charge level of the EV;
determining, by the one or more processors, a minimum charge level required for the EV to travel to a target location based upon the EV location data and the battery data, wherein the target location is a charging station or a destination;
determining, by the one or more processors, that the charge level of the EV is less than the minimum charge level; and
dispatching, by the one or more processors, the ACV in response to determining that the charge level of the EV is less than the minimum charge level.
| 3. The computer-implemented method of claim 2, further comprising:
causing, by the one or more processors, a prompt to be displayed to a vehicle occupant requesting input from the vehicle occupant regarding whether to dispatch the ACV.
| 4. The computer-implemented method of claim 2, wherein:
the ACV is dispatched automatically.
| 5. The computer-implemented method of claim 1, wherein:
the ACV delivers the electrical power to the EV wirelessly.
| 6. The computer-implemented method of claim 5, wherein:
the ACV is configured to deliver the electrical power to the EV using a magnet system.
| 7. The computer-implemented method of claim 1, wherein the one or more processors are included in the ACV and the EV location data is obtained by the one or more processors using vehicle-to-vehicle communication between the EV and the ACV.
| 8. The computer-implemented method of claim 1, wherein the ACV is a drone.
| 9. The computer-implemented method of claim 1, wherein:
the ACV is a tow truck; and
the ACV delivers electrical power to the EV while towing the EV.
| 10. A computer system for charging an electric vehicle (EV), comprising:
one or more processors;
a non-transitory computer-readable memory coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the one or more processors to:
obtain a request to deliver electrical power to the EV;
identify at least one autonomous charging vehicle (ACV) to charge the EV, wherein the ACV is configured to charge the EV while the EV and the ACV are each moving;
obtain EV location data and ACV location data;
determine a coupling location based upon the EV location data and the ACV location data;
dispatch the ACV to the coupling location; and
cause the ACV to deliver the electrical power to the EV for a threshold charging period, wherein the ACV and the EV are each moving during at least a portion of the threshold charging period.
| 11. The computer system of claim 10, wherein the executable instructions further cause the one or more processors to:
obtain battery data of the EV, wherein the battery data includes at least a charge level of the EV;
determine a minimum charge level required for the EV to travel to a target location based upon the EV location data and the battery data, wherein the target location is a charging station or a destination;
determine that the charge level of the EV is less than the minimum charge level; and
dispatch the ACV in response to determining that the charge level of the EV is less than the minimum charge level.
| 12. The computer system of claim 10, wherein the executable instructions further cause the one or more processors to:
cause a prompt to be displayed to a vehicle occupant requesting input from the vehicle occupant regarding whether to dispatch the ACV.
| 13. The computer system of claim 10, wherein:
the ACV is dispatched automatically.
| 14. The computer system of claim 10, wherein:
the ACV delivers the electrical power to the EV wirelessly.
| 15. The computer system of claim 10, wherein:
the ACV is configured to deliver the electrical power to the EV using a magnet system.
| 16. A tangible, non-transitory computer-readable medium storing executable instructions for charging an electric vehicle (EV) that, when executed by one or more processors, cause the one or more processors to:
obtain a request to deliver electrical power to the EV;
identify at least one autonomous charging vehicle (ACV) to charge the EV, wherein the ACV is configured to charge the EV while the EV and the ACV are each moving;
obtain EV location data and ACV location data;
determine a coupling location based upon the EV location data and the ACV location data;
dispatch the ACV to the coupling location; and
cause the ACV to deliver the electrical power to the EV for a threshold charging period, wherein the ACV and the EV are each moving during at least a portion of the threshold charging period.
| 17. The tangible, non-transitory computer-readable medium of claim 16, wherein the executable instructions further cause the one or more processors to:
obtain battery data of the EV, wherein the battery data includes at least a charge level of the EV;
determine a minimum charge level required for the EV to travel to a target location based upon the EV location data and the battery data, wherein the target location is a charging station or a destination;
determine that the charge level of the EV is less than the minimum charge level; and
dispatch the ACV in response to determining that the charge level of the EV is less than the minimum charge level.
| 18. The tangible, non-transitory computer-readable medium of claim 16, wherein the one or more processors are included in the ACV and the EV location data is obtained by the one or more processors using vehicle-to-vehicle communication between the EV and the ACV.
| 19. The tangible, non-transitory computer-readable medium of claim 16, wherein the ACV is a drone.
| 20. The tangible, non-transitory computer-readable medium of claim 16, wherein:
the ACV is a tow truck; and
the ACV delivers electrical power to the EV while towing the EV. | The method (800) involves obtaining a request to deliver electrical power to an electric vehicle (EV) by processors (802). An autonomous charging vehicle (ACV) is identified (804) by the processors to charge the EV, where the ACV is configured to charge the EV while the EV and the ACV are each moving. EV location data and ACV location data are obtained (806) by the processors. A coupling location is determined (808) by the processors based upon the EV location data and the ACV location data. The ACV is dispatched (810) to the coupling location by the processors. The ACV is caused (812) to deliver electrical power to the EV for a threshold charging period by the processors, where the ACV and the EV are each moving during a portion of the threshold charging period. INDEPENDENT CLAIMS are included for: (1) a computer system for charging an electric vehicle (EV); (2) a tangible non-transitory computer-readable medium comprising a set of instructions for charging an EV. Computer-implemented method for charging an electric vehicle (EV), i.e. an autonomous or semi-autonomous vehicle such as a solar electric vehicle, during movement by using an ACV, e.g. a drone or tow truck. The method enables charging the EV without the use of stationary charging stations in an efficient manner. The method enables the ACV to determine the appropriate charge needed to allow the EV to travel to a charging station along the EV route, i.e. the minimum amount of power necessary to give the EV enough charge to reach the nearest charging station. The drawing shows a flow diagram illustrating a computer-implemented method for charging an EV. 800: computer-implemented method for charging an EV. 802: step of obtaining a request to deliver electrical power to the EV by the processors. 804: step of identifying an ACV to charge the EV, where the ACV is configured to charge the EV while the EV and the ACV are each moving. 806: step of obtaining EV location data and ACV location data. 808: step of determining a coupling location based upon the EV location data and the ACV location data. 810: step of dispatching the ACV to the coupling location. 812: step of causing the ACV to deliver electrical power to the EV for a threshold charging period.
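The dispatch logic recited in claims 2, 11 and 17 above reduces to one check: estimate the minimum charge the EV needs to reach a charging station or its destination, compare it with the current charge level, and dispatch the ACV when the level falls short. A minimal sketch of that check follows; the flat watt-hours-per-kilometre consumption figure, the haversine distance estimate, the safety margin and all function names are illustrative assumptions and are not taken from the patent.

import math

WH_PER_KM = 160.0  # assumed average consumption of the EV (not specified in the patent)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def minimum_charge_wh(ev_location, target_location, margin=1.15):
    """Charge needed to reach the target (charging station or destination), with a safety margin."""
    distance_km = haversine_km(*ev_location, *target_location)
    return distance_km * WH_PER_KM * margin

def should_dispatch_acv(ev_location, target_location, charge_level_wh):
    """True when the EV's current charge is below the minimum needed to reach the target."""
    return charge_level_wh < minimum_charge_wh(ev_location, target_location)

if __name__ == "__main__":
    ev = (37.7749, -122.4194)        # hypothetical EV fix
    station = (37.8044, -122.2712)   # hypothetical nearest charging station
    print(should_dispatch_acv(ev, station, charge_level_wh=1200.0))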
Please summarize the input | Precautionary planning for a minimal risk maneuver. The application relates to a method executed by an MRM planning system (1) for supporting the planning of a minimal risk maneuver (MRM) of an automated driving system (ADS) (21) of a vehicle (2). While the dynamic driving task (DDT) of the vehicle is executed in an autonomous driving mode of the ADS, the MRM planning system determines (1001) a remaining distance (3) to an upcoming operational design domain (ODD) exit (4) at which the ODD defined for the autonomous driving mode will end. When the remaining distance is shorter than a predeterminable distance, the MRM planning system further assesses (1002) data (5) associated with a stretch of road (6) leading up to the ODD exit.|1. A method executed by an MRM planning system (1) for supporting the planning of a minimal risk maneuver (MRM) of an automated driving system (ADS) (21) of a vehicle (2), the method comprising: while a dynamic driving task (DDT) of the vehicle (2) is executed in an autonomous driving mode of the ADS (21), determining (1001) a remaining distance (3) to an upcoming operational design domain (ODD) exit (4) at which the ODD defined for the autonomous driving mode will end; when the remaining distance (3) is shorter than a predeterminable distance, assessing (1002) data (5) associated with a stretch of road (6) leading up to the ODD exit (4), the data (5) indicating potential temporary and/or emergency stop areas, shoulder lanes and/or margin space of one or more lanes along the stretch of road (6), and/or potential occupancy and/or obstructions thereof; identifying (1003), based on the assessment of the data (5), a favorable area (60) along the stretch of road (6) for potentially bringing the vehicle (2) to a stop deemed safe should a potential MRM be triggered; and determining (1004) a time (8) at which to initiate an autonomous-driving-mode DDT termination procedure, for example by providing at the time (8) a handover request prompting an occupant of the vehicle (2) to take over the DDT, the time (8) being calculated to occur a predeterminable duration and/or distance before the vehicle (2) reaches the identified favorable area (60).
| 2. The method according to claim 1, wherein identifying (1003) the favorable area (60) comprises selecting the favorable area (60) based on a selection criterion that balances the level of safety associated with stopping in a given area against that area's proximity to the ODD exit (4).
| 3. The method according to claim 1 or 2, wherein the assessing (1002) comprises: assessing data (5) comprising map-based data (51) derived from a digital map (22), such as a high-definition (HD) map, covering the stretch of road (6); and/or assessing data (5) comprising sensor-based data (52) derived from at least a first surrounding-detecting sensor capable of capturing an environment covering at least a portion of the stretch of road (6).
| 4. The method according to claim 3, wherein assessing data (5) comprising sensor-based data (52) comprises gathering at least a portion of the sensor-based data (52) from one or more sensors external to the vehicle (2) via at least a first inter-vehicle communication service.
| 5. The method according to any one of claims 1 to 4, further comprising: communicating (1006) data (600) indicative of the identified favorable area (60) to the ADS (21) and/or within the ADS (21), for example to a trajectory planner and/or vehicle motion control of the ADS (21).
| 6. The method according to any one of claims 1 to 5, wherein determining (1004) the time at which to initiate the autonomous-driving-mode DDT termination procedure comprises determining a time at which to trigger the MRM.
| 7. An MRM planning system (1) for supporting the planning of a minimal risk maneuver (MRM) of an automated driving system (ADS) (21) of a vehicle (2), the MRM planning system (1) comprising: a remaining-distance determining unit (101) for, while a dynamic driving task (DDT) of the vehicle (2) is executed in an autonomous driving mode of the ADS (21), determining (1001) a remaining distance (3) to an upcoming operational design domain (ODD) exit (4) at which the ODD defined for the autonomous driving mode will end; a data assessing unit (102) for, when the remaining distance (3) is shorter than a predeterminable distance, assessing (1002) data (5) associated with a stretch of road (6) leading up to the ODD exit (4), the data (5) indicating potential temporary and/or emergency stop areas, shoulder lanes and/or margin space of one or more lanes along the stretch of road (6), and/or potential occupancy and/or obstructions thereof; a favorable-area identifying unit (103) for identifying (1003), based on the assessment of the data (5), a favorable area (60) along the stretch of road (6) for potentially bringing the vehicle (2) to a stop deemed safe should a potential MRM be triggered; and a time determining unit (104) for determining (1004) a time (8) at which to initiate an autonomous-driving-mode DDT termination procedure, for example by providing at the time (8) a handover request prompting an occupant of the vehicle (2) to take over the DDT, the time (8) being calculated to occur a predeterminable duration and/or distance before the vehicle (2) reaches the identified favorable area (60).
| 8. The MRM planning system (1) according to claim 7, wherein the favorable-area identifying unit (103) is adapted to select the favorable area (60) based on a selection criterion that balances the level of safety associated with stopping in a given area against that area's proximity to the ODD exit (4).
| 9. The MRM planning system (1) according to claim 7 or 8, wherein the data assessing unit (102) is adapted to: assess data (5) comprising map-based data (51) derived from a digital map (22), such as a high-definition (HD) map, covering the stretch of road (6); and/or assess data (5) comprising sensor-based data (52) derived from at least a first surrounding-detecting sensor capable of capturing an environment covering at least a portion of the stretch of road (6).
| 10. The MRM planning system (1) according to claim 9, wherein the data assessing unit (102) is adapted to gather at least a portion of the sensor-based data (52) from one or more sensors external to the vehicle (2) via at least a first inter-vehicle communication service.
| 11. The MRM planning system (1) according to any one of claims 7 to 10, further comprising: a data communication unit (106) for communicating (1006) data (600) indicative of the identified favorable area (60) to the ADS (21) and/or within the ADS (21), for example to a trajectory planner and/or vehicle motion control of the ADS (21).
| 12. The MRM planning system (1) according to any one of claims 7 to 11, wherein the time determining unit (104) is adapted to determine a time at which to trigger the MRM.
| 13. A vehicle (2) comprising the MRM planning system (1) according to any one of claims 7 to 12.
| 14. A computer program product comprising a computer program containing computer program code means arranged to cause a computer or a processor to execute the steps of the method according to any one of claims 1 to 6, the computer program product being stored on a computer-readable medium or carrier.
| 15. A non-volatile computer-readable storage medium having stored thereon the computer program product according to claim 14. | The method involves determining, while a dynamic driving task (DDT) of the vehicle is executed, the remaining distance to an upcoming operational design domain (ODD) exit. Data associated with a stretch of road leading up to the ODD exit is assessed when the remaining distance is shorter than a predeterminable distance. A favorable area (60) is identified along the stretch of road, based on the assessment of the data, for potentially bringing the vehicle (2) to a stop deemed safe. A handover request is provided to prompt an occupant of the vehicle to take over the DDT. The gathering of sensor-based data is performed with support from a first inter-vehicle communication service. INDEPENDENT CLAIMS are included for: (1) a system for supporting planning of a minimal risk maneuver of an automated driving system of a vehicle; (2) a computer-readable storage medium storing a set of instructions for performing a method for supporting planning of a minimal risk maneuver of an automated driving system of a vehicle. Method for supporting planning of a minimal risk maneuver (MRM) of an automated driving system (ADS) of a vehicle. The method enables precautionary actions so that, upon potential activation of the MRM, a potentially lower-risk maneuver is available to choose, enabling improved MRM planning. The drawing shows a schematic diagram of a method for supporting planning of a minimal risk maneuver of an automated driving system of a vehicle. 1: MRM planning system. 2: Vehicle. 6: Road segments. 60: Favorable area. 61: Emergency parking area.
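The planning flow of claims 1 and 2 above — trigger the assessment when the remaining distance to the ODD exit drops below a threshold, score candidate stop areas by trading safety against proximity to the exit, and back-calculate the handover time from the distance to the chosen area — might be sketched as below. The data structure, scoring weights, corridor length and constant-speed assumption are illustrative only and are not specified in the application.

from dataclasses import dataclass

@dataclass
class StopArea:
    distance_to_exit_m: float   # how far before the ODD exit the area lies
    safety_level: float         # 0..1, higher is safer (e.g. emergency bay > hard shoulder)
    occupied: bool

def select_favorable_area(areas, w_safety=0.7, w_proximity=0.3, corridor_m=2000.0):
    """Balance safety of stopping in an area against its proximity to the ODD exit (claim 2)."""
    candidates = [a for a in areas if not a.occupied]
    if not candidates:
        return None
    def score(a):
        proximity = max(0.0, 1.0 - a.distance_to_exit_m / corridor_m)
        return w_safety * a.safety_level + w_proximity * proximity
    return max(candidates, key=score)

def handover_time_s(distance_to_area_m, speed_mps, lead_time_s=10.0):
    """Issue the take-over request a predeterminable duration before reaching the chosen area."""
    return max(0.0, distance_to_area_m / speed_mps - lead_time_s)

if __name__ == "__main__":
    areas = [StopArea(1500.0, 0.9, False), StopArea(300.0, 0.5, False), StopArea(800.0, 0.8, True)]
    best = select_favorable_area(areas)
    # Assume the ODD exit is 2500 m ahead, so the chosen area is 2500 - distance_to_exit away.
    print(best, handover_time_s(distance_to_area_m=2500.0 - best.distance_to_exit_m, speed_mps=25.0))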
Please summarize the input | Test field for the vehicle-avoidance capability of an automated driving vehicle. The utility model relates to the technical field of automated driving vehicle performance testing, and in particular to a test field for testing an automated driving vehicle's ability to avoid an approaching vehicle. The test field comprises a road, a first reference vehicle and a test operation management center. The test operation management center controls the first reference vehicle to drive in front of or behind the vehicle under test while keeping a distance from it, to drive alongside the vehicle under test, or to drive in the opposite direction in the lane adjacent to the vehicle under test, and then controls the first reference vehicle to change its speed and/or driving direction so as to approach the vehicle under test. Compared with the theoretical data produced by computer-simulated vehicle operating software, this test field is closer to the actual driving environment, so the test results reflect the vehicle's ability to avoid an approaching vehicle more accurately; compared with testing on actual roads it is safer; and it can serve as a standardized test scenario for different automated driving vehicles, making the test results more authoritative and reliable.|1. A test field for the ability of an automated driving vehicle to avoid an approaching vehicle, characterized in that it comprises: a road (1); a first reference vehicle (2) travelling on the road (1), the first reference vehicle (2) being an unmanned vehicle; and a test operation management center (5) that controls the first reference vehicle (2) to travel in front of or behind the vehicle under test (3) while keeping a distance from it, or controls the first reference vehicle (2) to travel alongside the vehicle under test (3), or controls the first reference vehicle (2) to travel in the opposite direction in the lane adjacent to the vehicle under test (3), and then controls the first reference vehicle (2) to change its speed and/or driving direction so as to approach the vehicle under test (3); wherein the test operation management center (5) and the first reference vehicle (2) each comprise a wireless communication device with a V2N communication protocol, and the test operation management center (5) and the first reference vehicle (2) are communicatively connected with each other through their respective V2N wireless communication devices, so that the test operation management center (5) can control the running of the first reference vehicle (2).
| 2. The test field for the vehicle-avoidance capability of an automated driving vehicle according to claim 1, wherein the wireless communication device of the first reference vehicle (2), which has the V2N communication protocol, further has a V2V communication protocol for communicatively connecting to the vehicle under test (3) and for sending early-warning information to the vehicle under test (3) when the first reference vehicle (2) changes its own speed and/or driving direction to approach the vehicle under test (3).
| 3. The automatic driving vehicle according to claim 2 avoiding the vehicle capacity of the test field, wherein it further comprises a road side device (4), the road side device (4) is set on the road (1) or the side, wherein the wireless communication device of the first reference vehicle (2) further comprises a V2I communication protocol can itself change the vehicle speed and/or direction of travel to close to the tested vehicle (3) when sending out the pre-warning information to the road side device (4), the road side device (4) comprises receiving the first reference vehicle (2) sends the warning information and the to-be-tested vehicle (3) sends the warning information. wireless communication device with V2I communication protocol.
| 4. The automatic driving vehicle according to claim 3 avoiding the vehicle capacity of the test field, wherein it further comprises: said road (1) comprises intersection (11) and chute (12) and the intersection (11), the road side unit (4) is arranged at the intersection (11).
| 5. The automatic driving vehicle according to claim 1 avoiding the vehicle capacity of the test field, wherein it further comprises auxiliary road safety facility (7), the road additional safe device (7) is arranged on the road (1) or the side.
| 6. The automatic driving vehicle according to claim 1 avoiding the vehicle capacity of the test field, wherein it further comprises a second reference vehicle (6) travelling on the road (1), the second reference vehicle (6) is an unmanned vehicle. the second reference vehicle (6) comprises a wireless communication device with V2N communication protocol of the wireless communication device by the service test management centre (5) and the second reference vehicle (6) through respective V2N communication protocol with each communication connection, to test the operation management centre (5) to control the second reference vehicle (6) travelling on the road (1) and at the periphery of the vehicle to be measured (3).
| 7. The automatic driving vehicle according to claim 6 avoiding the vehicle capacity of the test field, wherein the wireless communication device of the second reference vehicle (6) further has a V2V communication protocol to receive vehicle (3) sends the warning information.
| 8. The automatic driving vehicle according to claim 7 avoiding the vehicle capacity of the test field, wherein it further comprises the second reference vehicle (6) interconnected to collect light information of the second reference vehicle (6) and can be connected with the tested vehicle (3) interconnected to the collecting module of receiving and sending information and path planning information of the vehicle (3).
| 9. The test field for the vehicle-avoidance capability of an automated driving vehicle according to claim 1, wherein the road (1) comprises a one-way lane or bidirectional lanes without a median separator. | The utility model relates to the technical field of automated driving vehicle performance testing, and in particular to a test field for testing an automated driving vehicle's ability to avoid an approaching vehicle. The test field comprises a road, a first reference vehicle and a test operation management center. The test operation management center controls the first reference vehicle to drive in front of or behind the vehicle under test while keeping a distance from it, to drive alongside the vehicle under test, or to drive in the opposite direction in the lane adjacent to the vehicle under test, and then controls the first reference vehicle to change its speed and/or driving direction so as to approach the vehicle under test. Compared with the theoretical data produced by computer-simulated vehicle operating software, this test field is closer to the actual driving environment, so the test results reflect the vehicle's ability to avoid an approaching vehicle more accurately; compared with testing on actual roads it is safer; and it can serve as a standardized test scenario for different automated driving vehicles, making the test results more authoritative and reliable.
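The control sequence of claims 1 and 2 above — the test operation management center first holds the first reference vehicle at a gap from the vehicle under test over V2N, then commands a speed or heading change that closes the gap, accompanied by a V2V warning broadcast — could be scripted roughly as follows. The command names, message format and gap/timing values are assumptions for illustration; the utility model does not define a wire format.

import json
import time

def send_v2n(command: dict) -> None:
    """Stand-in for the V2N link between the test operation management center and the reference vehicle."""
    print(json.dumps(command))

def approach_scenario(gap_m=30.0, hold_s=5.0, closing_speed_delta_mps=3.0):
    """Hold a gap behind the vehicle under test, then close in to provoke an avoidance response."""
    send_v2n({"cmd": "follow", "target": "vehicle_under_test", "gap_m": gap_m})
    time.sleep(hold_s)  # let the scenario stabilise before closing in
    send_v2n({"cmd": "adjust_speed", "delta_mps": closing_speed_delta_mps})
    send_v2n({"cmd": "broadcast_v2v_warning", "reason": "speed_change"})  # claim 2 behaviour

if __name__ == "__main__":
    approach_scenario()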
Please summarize the input | Test field for an automated driving vehicle's ability to pass through successive intersections. The utility model relates to the technical field of automated driving vehicle performance testing, and in particular to a test field for testing an automated driving vehicle's ability to pass through successive intersections. The test field comprises a road with at least two intersections. A signal lamp device and a roadside device capable of detecting the road condition and forming first traffic information are provided at each intersection; each roadside device is communicatively connected with the corresponding signal lamp device to obtain the signal lamp information, and each roadside device comprises a wireless communication device that sends the first traffic information and the signal lamp information to the vehicle under test. Compared with the theoretical data produced by computer-simulated vehicle operating software, this test field is closer to the actual driving environment, so the test results show the vehicle's ability to pass through successive intersections more accurately; compared with testing on actual roads it is safer; and it can serve as a standardized scenario for testing different automated driving vehicles, making the test results more authoritative and reliable.|1. A test field for an automated driving vehicle's ability to pass through successive intersections, characterized in that the test field comprises a road (1), the road (1) comprising at least two intersections (2) arranged at intervals; a signal lamp device and a roadside device (5) capable of detecting the road condition and forming first traffic information are correspondingly provided at each intersection (2); each roadside device (5) is communicatively connected with the corresponding signal lamp device to obtain the signal lamp information; and each roadside device (5) comprises a wireless communication device with a V2I communication protocol that sends the first traffic information and the signal lamp information to the vehicle under test (3).
| 2. The automatic driving vehicle according to claim 1 continuously passes through the crossing capacity of the test field, wherein the wireless communication device further comprises: an operation test management centre (7), the operation management centre (7) comprises a vehicle to be measured (3) through each intersection (2) before sending the second traffic information corresponding to the tested vehicle (3), having a V2N communication protocol.
| 3. The test field for an automated driving vehicle's ability to pass through successive intersections according to claim 2, wherein each roadside device (5) is connected with the test operation management center (7) to transmit the signal lamp information; the wireless communication device of the test operation management center (7) can receive driving route information sent by the vehicle under test (3); the test operation management center (7) further comprises an analysis module that analyzes and processes the signal lamp information and the second traffic information corresponding to each intersection (2) together with the driving route information so as to form a first suggested travelling speed at which the vehicle under test can pass each intersection (2); the test operation management center (7) is also capable of sending the first suggested travelling speed to the vehicle under test (3) through the wireless communication device; and the wireless communication device of the test operation management center (7) is connected with the analysis module.
| 4. The automatic driving vehicle according to claim 2 continuously passes through the crossing capacity of the test field, further comprising: a reference vehicle (8) travelling on the road (1) and at the periphery of the vehicle to be measured (3).
| 5. The automatic driving vehicle according to claim 4 continuously passes through the crossing capacity of the test field, wherein the reference vehicle (8) is an automatic driving vehicle, the reference vehicle (8) comprises a wireless communication device with V2N communication protocol; wireless communication device by the service test management centre (7) and the reference vehicle (8) through respective V2N communication protocol with each communication connection so as to test the operation management centre (7) controls the reference vehicle (8) runs.
| 6. The automatic driving vehicle according to claim 5 continuously passes through the crossing capacity of the test field, wherein the reference vehicle (8) is an unmanned vehicle.
| 7. The automatic driving vehicle according to claim 1 continuously passes through the crossing capacity of the test field, wherein it is set with at least three intersection (2), the distance between two adjacent intersection (2) is more than or equal to 150 m.
| 8. The automatic driving vehicle according to claim 1 continuously passes through the crossing capacity of the test field, wherein the signal lamp device comprises a signal lamp and a control signal lamp controller (6), the road-side device and the controller (6) in communication way.
| 9. The automatic driving vehicle according to claim 1 continuously passes through the crossing capacity of the test field, wherein, further comprising: a communication connection of the tested vehicle (3) so as to obtain the information of the tested vehicle (3) of the collection module. | The utility model claims an automatic driving vehicle performance test technical field, especially claims an automatic driving vehicle continuously by crossing ability of the test field. test field comprises road, at least two corresponding intersection, at each intersection are provided with a signal lamp device and capable of detecting the road condition and form a first traffic information of the road side device, each road side device and the corresponding signal lamp device communication connection to obtain the signal lamp information, each road side device comprises a to-be-tested vehicle sends wireless communication device of the first traffic information and signal lamp information. the test field compared to a computer simulated vehicles operating software the theoretical data more close to the actual running environment, the test result can more accurately show the tested vehicle continuous passing ability of the intersection, compared with the actual road test is more safe, the test field can be used as standardized scene for testing of different automatic driving vehicle, the test result is more authoritative and reliable. |
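Claim 3's analysis module — combining each intersection's signal lamp information with the vehicle's route to form a first suggested travelling speed for passing each intersection — is essentially a green-light speed advisory. A minimal sketch under an assumed fixed signal cycle follows; the speed limits and the cycle parameters are illustrative and are not taken from the utility model.

def advisory_speed_mps(distance_m, time_to_green_s, green_duration_s,
                       v_min=5.0, v_max=16.7):
    """Pick a speed that lets the vehicle arrive at the stop line while the signal is green, if one exists."""
    # Arriving at the first moment of green requires distance / time_to_green;
    # arriving at the last moment of green requires distance / (time_to_green + green_duration).
    if time_to_green_s <= 0:           # already green: drive at the ceiling
        return v_max
    upper = distance_m / time_to_green_s
    lower = distance_m / (time_to_green_s + green_duration_s)
    lo, hi = max(lower, v_min), min(upper, v_max)
    return hi if lo <= hi else None    # None: no feasible speed, the vehicle will have to stop

if __name__ == "__main__":
    # 300 m to the stop line, green starts in 20 s and lasts 25 s -> 15 m/s is suggested
    print(advisory_speed_mps(300.0, 20.0, 25.0))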
Please summarize the input | Test field for the lane-changing capability of an automated driving vehicle. The utility model relates to the technical field of automated driving vehicle performance testing, and in particular to a test field for testing an automated driving vehicle's lane-changing capability. The test field comprises a road with a first lane and a second lane, a first reference vehicle travelling on the first lane, and a test operation management center that controls the first reference vehicle to drive in front of the vehicle under test. The wireless communication devices of the test operation management center and of the first reference vehicle, each with a V2N communication protocol, are communicatively connected with each other so that the test operation management center controls the running of the first reference vehicle, and the test operation management center can send a lane-change command or a driving-task command to the vehicle under test through the wireless communication device. The test field is closer to the actual driving environment, so the test results reflect the vehicle's lane-changing capability more accurately; it is safer than testing on actual roads; and it is suitable for different automated driving vehicles, giving a standardized evaluation of the lane-changing capability of the vehicle under test and making the test results more authoritative and reliable.|1. A test field for the lane-changing capability of an automated driving vehicle, characterized in that it comprises: a road (1) comprising an adjacent first lane (11) and second lane (12); a first reference vehicle (2) on the first lane (11), the first reference vehicle (2) being a non-human-driven vehicle and comprising a wireless communication device with a V2N communication protocol; and a test operation management center (4) that controls the first reference vehicle (2) to drive in front of the vehicle under test (3); the test operation management center (4) comprises a wireless communication device with a V2N communication protocol, and the test operation management center (4) and the first reference vehicle (2) are communicatively connected with each other through their respective V2N wireless communication devices, so that the test operation management center (4) controls the running of the first reference vehicle (2), and the test operation management center (4) can send a lane-change command or a driving-task command to the vehicle under test (3) through the wireless communication device; wherein, when the driving-task command is sent, the driving-task command comprises the mileage from the vehicle under test (3) to a destination and a required arrival time or a required travel time, and the test operation management center (4) controls the first reference vehicle (2) to run at a speed smaller than the quotient of that mileage and the difference between the required arrival time and the current time, or smaller than the quotient of that mileage and the required travel time.
| 2. The automatic driving vehicle according to claim 1, the said changing capability of the test field, wherein it further comprises a second reference vehicle (5), said second reference vehicle (5) is non-human driving vehicle, said second reference vehicle (5) comprises a wireless communication device with V2N communication protocol. wireless communication device by the service test management centre (4) and the second reference vehicle (5) through respective V2N communication protocol with each communication connection so as to test the operation management centre (4) control the second reference vehicle (5) on said second lane (12) and the tested vehicle (3) and the first reference vehicle (2) with both travel in the same direction, and the second reference vehicle (5) as the detected vehicle (3) has changeable distance.
| 3. The automatic driving vehicle according to claim 1, the said changing capability of the test field, wherein it further comprises a third reference vehicle (6), said third reference vehicle (6) is non-human driving vehicle, said third reference vehicle (6) comprises a wireless communication device with V2N communication protocol. wireless communication device by the service test management centre (4) and the third reference vehicle (6) through respective V2N communication protocol with each communication connection so as to test the operation management centre (4) control the third reference vehicle (6) facing with the first reference vehicle (2) on the second lane (12).
| 4. The automatic driving vehicle according to claim 1, the said changing capability of the test field, wherein it further comprises a wireless communication device side device (7), the road side device (7) is set on the road (1) or beside the road side device (7) comprises a road condition information to the vehicle (3) before changing the tested vehicle (3), having a V2I communication protocol.
| 5. The automatic driving vehicle according to claim 4, the said changing capability of the test field, wherein the road side device (7) further comprises automatically detecting the road condition detector, and the detector receives the road condition related information analyzing processing for forming traffic information processing module, said detector and said processing module connecting the processing module and the V2I communication protocol of the wireless communication device communication connection.
| 6. The automatic driving vehicle according to claim 1, the said changing capability of the test field, wherein the operation test management centre (4) is also capable of changing the forward vehicle (3) sends the road condition information to be detected vehicle (3) by the wireless communication device.
| 7. The automatic driving vehicle according to claim 1, the said changing capability of the test field, wherein the wireless communication device of the first reference vehicle (2) has a V2V communication protocol, the first reference vehicle (2) can through the wireless communication equipment receives the tested vehicle (3) sends changing request and receiving the changing request and reply the consent information.
| 8. The automatic driving vehicle according to claim 2, the said changing capability of the test field, wherein the wireless communication device of the second reference vehicle (5) has a V2V communication protocol, the second reference vehicle (5) can through the wireless communication equipment receives the tested vehicle (3) sends changing request and receiving the changing request and reply the consent information.
| 9. The automatic driving vehicle according to claim 3, the said changing capability of the test field, wherein the wireless communication device of the third reference vehicle (6) has a V2V communication protocol, the third reference vehicle (6) can through the wireless communication equipment receives the tested vehicle (3) sends changing request and receiving the changing request and reply the consent information.
| 10. The automatic driving vehicle according to claim 1, the said changing capability of the test field, wherein it further comprises a vehicle to be tested (3) interconnected by collecting collecting module for receiving and sending information, operation information and route planning information of the vehicle (3). | The utility model claims an automatic driving vehicle performance testing technology field, especially relates to a testing field of automatic driving vehicle lane changing capability. testing field comprising a first reference vehicle having a first lane and a second lane of the road, running on the first lane, control the first reference vehicle in front of the vehicle to be measured running operation testing management centre; Wireless communication device of operation testing management centre and the first reference vehicle has V2N communication protocol through the respective communication connection with one another to control the first reference vehicle running and operation testing management centre can send command or running the task command to the vehicle through the wireless communication device. the testing field is more close to the actual running environment, testing result more accurately vehicle lane changing ability, compared with the actual road testing manner is more safety, and is suitable for different automatic driving vehicle, normalized to the vehicle lane changing ability evaluation, testing result more authoritative and reliable. |
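The speed cap in claim 1 above — keep the first reference vehicle slower than the quotient of the remaining mileage and the time budget of the driving-task command, so the vehicle under test can only complete its task by changing lanes and overtaking — reduces to a one-line bound. The sketch below, including the margin factor that keeps the speed strictly below the quotient, is an illustrative assumption rather than anything specified by the utility model.

def lead_vehicle_speed_cap(distance_m, arrival_deadline_s=None, now_s=None,
                           travel_time_s=None, margin=0.9):
    """Upper bound on the reference vehicle's speed so the vehicle under test is forced to overtake.

    Uses distance / (required arrival time - current time) when an arrival time is given
    (claim 1, first case), otherwise distance / required travel time (second case);
    'margin' keeps the commanded speed strictly below the quotient.
    """
    if arrival_deadline_s is not None and now_s is not None:
        budget_s = arrival_deadline_s - now_s
    else:
        budget_s = travel_time_s
    if budget_s is None or budget_s <= 0:
        raise ValueError("a positive time budget is required")
    return margin * distance_m / budget_s

if __name__ == "__main__":
    # 5 km task with a 300 s budget: the vehicle under test needs about 16.7 m/s on average,
    # so the lead reference vehicle is held to roughly 15 m/s.
    print(lead_vehicle_speed_cap(5000.0, travel_time_s=300.0))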
Please summarize the input | Test field for an automated driving vehicle's ability to pass through a signal interference region. The utility model relates to the technical field of automated driving vehicle performance testing, and in particular to a test field for testing an automated driving vehicle's ability to pass through a signal interference region. The test field comprises a road and a signal interference facility; the road comprises a section without signal interference and a section with signal interference corresponding to the signal interference facility; and the test field further comprises a test operation management center that sends first road information to the vehicle under test while the vehicle under test is driving in the signal interference section. Compared with a laboratory simulation environment that has no signal transmission and reception and relies only on machine vision, and compared with the theoretical data of computer-simulated vehicle operating software, this test field is closer to the actual driving environment, so the test results accurately represent the vehicle's ability to pass through a signal interference region; it is safer than testing on actual roads; and it can serve as a standardized scenario for testing different automated driving vehicles, making the test results more authoritative and reliable.|1. A test field for an automated driving vehicle's ability to pass through a signal interference region, characterized in that it comprises: a road (1) and a signal interference device (2) that interferes with signal transmission on a section of the road (1), the road (1) comprising a section (11) without signal interference and a section (12) with signal interference corresponding to the signal interference device (2); and a test operation management center (4) that sends first road information to the vehicle under test (3) while the vehicle under test (3) is driving in the signal interference section (12), the test operation management center (4) comprising a wireless communication device with a V2N communication protocol.
| 2. The automatic driving vehicle according to claim 1 the test signal of the ability of interference area, wherein along said road (1) are provided with multiple signal for the interference device (2); said road (1) comprises respectively corresponding to a plurality of signal-to-interference equipment (2) of a plurality of signal-to-interference section (12) and multiple signal-to-interference-free section (11), the plurality of signal interference section (12) and said multiple signal-to-interference-free section (11) are alternately arranged.
| 3. The automatic driving vehicle according to claim 1 or 2 the ability of signal interference test field zone, wherein the signal-to-interference device (2) is set on the road (1) on two sides of the street, tunnel set above the road (1), set on the road (1) above the bridge, or meteorological analogue interference signal transmission of meteorological simulation device.
| 4. The automatic driving vehicle according to claim 3 the capability of signal interference region of the test field, wherein the length of the hideaways is greater than 50m; the length of the tunnel is greater than or equal to 50m, the tunnel made of concrete, or concrete and reinforcing steel. the width of the bridge is greater than or equal to 10m, and the length is more than 20m, the bridge made of concrete, or concrete and reinforcing steel, the meteorological simulation device comprises a meteorological simulation chamber and in the chamber producing a meteorological simulation interference signal transmission of meteorological weather simulator. said road (1) set through the weather simulation chamber, weather, interference of signal transmission the weather simulator manufactured by covering length is more than or equal to 30m.
| 5. The automatic driving vehicle according to claim 3, the ability of a test field signal interference region, wherein the hideaways comprises a plurality of hideaways segments with different densities, the length of each said hideaways section is more than or equal to 20m.
| 6. The automatic driving vehicle according to claim 1 the test signal of the ability of interference area, further comprising: a reference vehicle (6) used on the road (1) and running on the periphery of the vehicle (3) to be tested, wherein the reference vehicle (6) comprises a wireless communication device with V2V communication protocol.
| 7. The automatic driving vehicle according to claim 1 the test signal of the ability of interference area, wherein it further comprises a forward vehicle on the tested vehicle (3) entering said signal interference section (12) (3) sends the road condition information of the road side device (5); the road side device (5) the signal interference-free section (11) or on side, comprising a wireless communication device for sending the road condition information to the tested vehicle (3), having a V2I communication protocol.
| 8. The automatic driving vehicle according to claim 7 the field of capability of the signal interference region, wherein the road side device (5) further comprises a detector capable of automatically detecting the road condition and/or input terminal of the manual input condition, and a processing module for processing the information.
| 9. The automatic driving vehicle according to claim 1 the test signal of the ability of interference area, wherein the operation testing management centre (4) further comprises: controlling the V2N communication protocol of the wireless communication device in the vehicle (3) into the signal interference section (12) before sending the second road information to the tested vehicle (3), and issuing a first road information of the control module to the tested vehicle (3) in the vehicle (3) to be tested in said signal interference section (12) when driving.
| 10. The automatic driving vehicle according to claim 1 the field of capability of the signal interference region, further comprising: a device for the interconnection of the tested vehicle (3) by collecting module for collecting the information of the vehicle (3). | The utility model claims an automatic driving vehicle performance testing technical field, especially claims an automatic driving vehicle through testing field capability signal interference region. testing field comprises facilities, road road and signal interference comprises interference without signal section and a signal corresponding to interference of signal interference, test field further comprises emitting the first road information operation testing management centre to the vehicle when said to-be-detected vehicle driving road section in the signal interference. the testing field compared to simulated environment laboratory does not have signal transceiving but only by machine vision and compared with theoretical data of computer simulated vehicles operating software, which is more close to the actual running environment, the testing result can accurately represent the vehicle through capacity of signal interference region, and it is more safe to actual road testing, the testing field as standard scene for testing of different automatic driving vehicle, so as to make the test result more authoritative and reliable. |
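Claim 9's control module — send second road information to the vehicle under test before it enters the signal interference section and first road information while it is driving inside it — can be mocked up as a simple position-triggered selector. The section boundaries, lead distance and payload fields below are assumptions for illustration only.

def road_info_for_position(s_m, section_start_m=500.0, section_end_m=800.0, lead_m=100.0):
    """Decide which road-information message the test management center should send (claim 9)."""
    if section_start_m - lead_m <= s_m < section_start_m:
        return {"type": "second_road_info", "note": "interference section ahead"}
    if section_start_m <= s_m <= section_end_m:
        return {"type": "first_road_info", "note": "inside interference section"}
    return None  # no message needed elsewhere on the course

if __name__ == "__main__":
    for s in (350.0, 450.0, 650.0, 900.0):
        print(s, road_info_for_position(s))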
Please summarize the input | Method and test field for testing an automated driving vehicle's ability to avoid dynamic obstacles. The invention relates to the technical field of automated driving vehicle performance testing, and in particular to a method and a test field for testing an automated driving vehicle's ability to avoid dynamic obstacles. The test field used by the test method comprises a road and a dynamic obstacle. The scenario arrangement is to control the dynamic obstacle to move along a movement route according to the travel route of the vehicle under test, and to control the dynamic obstacle to move in such a way that, if the vehicle under test takes no avoidance action, the two will collide at the position where the travel route and the movement route intersect. The test is therefore closer to the actual driving environment, so the test results represent the vehicle's dynamic obstacle avoidance ability more accurately, and it is safer than testing on actual roads. Furthermore, the test field and the test method can be used for testing different automated driving vehicles, making the test results more authoritative and reliable.|1. A method for testing an automated driving vehicle's ability to avoid dynamic obstacles, characterized in that it comprises a test field scenario arrangement and a response of the vehicle under test; the scenario arrangement is that the test field comprises a road and a dynamic obstacle, the vehicle under test is placed on the road, the dynamic obstacle is controlled to move along a movement route according to the travel route of the vehicle under test, and the dynamic obstacle is controlled to move in such a way that, if the vehicle under test takes no avoidance action, it will collide with the vehicle under test at the position where the travel route and the movement route intersect; and the response of the vehicle under test is that the vehicle under test travels on the road according to the travel route and then performs an avoidance action with respect to the moving dynamic obstacle so as to avoid or reduce a collision with the dynamic obstacle.
| 2. The test method of avoiding dynamic obstruction capability the automatic driving vehicle according to claim 1, wherein the scene arrangement further includes: between a vehicle capable of avoiding distance, control the dynamic obstacle appears in the machine vision acquisition range of the detected vehicle position and the travel route and the travel route intersecting the testing vehicle.
| 3. The test method of avoiding dynamic obstruction capability the automatic driving vehicle according to claim 1 or 2, wherein said scene further comprises: controlling a plurality of dynamic obstacle according to the travel route of the vehicle moving route of movement, the vehicle response further comprises: finishing each dynamic obstacle avoidance the vehicles to be detected.
| 4. The test method of avoiding dynamic obstruction capability the automatic driving vehicle according to claim 1 or 2, wherein said scene further comprises: the dynamic obstacle is dummy, animal model or non motor vehicle and the dummy. animal model or vehicle comprising a carrying or send location information and/or motion information of the device, the measured vehicle response further comprises the vehicle to be tested collected by the location information and/or motion information and combining the state after analyzing and processing. finally finishing the action of avoiding the dummy, to avoid collision of the dummy, animal model or non-motor vehicle, or the scene arrangement further includes a dynamic obstacle is the vehicle. said motor vehicle comprises a wireless communication device with V2V communication protocol, the wireless communication equipment with V2V communication protocol sent by the travel path information of the vehicle, the vehicle response further comprises the vehicle receiving the driving route information and combining the state after analyzing and processing. finally finishing the action of avoiding the vehicle, to avoid or reduce the collision of said vehicle.
| 5. The test method of avoiding dynamic obstruction capability the automatic driving vehicle according to claim 1 or 2, wherein said scene further comprises: said test field further comprises running a test management centre the operation test management centre sends the road condition information to the vehicle to be measured, the measured vehicle response further comprises the vehicle receives the road condition information and combining the state after analyzing and processing, finally finishing the avoidance action of the dynamic barrier so as to avoid or reduce collision of the dynamic obstacle.
| 6. The test method of avoiding dynamic obstruction capability the automatic driving vehicle according to claim 1 or 2, wherein the scene arrangement further comprises the test field further comprises a road side device; the road side device detects the sending warning information to the vehicles to be detected has a dynamic obstacle of the road after the vehicle response further comprises: said vehicle information device receives the warning sent by the road side and combined with the state after analyzing and processing, finally finishing the action of the dynamic obstacle avoidance. to avoid or reduce collision of the motor vehicle.
| 7. The test method of avoiding dynamic obstruction capability the automatic driving vehicle according to claim 1 or 2, wherein said scene further comprises: said test field further comprises a signal lamp and a sight barrier, the signal lamp is installed and allows the detected vehicle traffic on the road, the line-of-sight barrier blocking the vehicle through machine vision to find the position of the dynamic barrier.
| 8. The test method of avoiding dynamic obstruction capability the automatic driving vehicle according to claim 1 or 2, wherein the to-be-tested vehicle response further comprises one or several of the following: a, the vehicle remind the car in the front road condition, b. the vehicles to be detected to the dynamic obstacle warning action and/or early warning information is sent out.
| 9. An automatic driving vehicle obstacle avoiding dynamic capability of the test field, wherein it comprises road, the dynamic obstacle, travel route intersecting the moving route of the dynamic obstacle with the vehicle to be tested, crossing position on the road, and the dynamic obstacle can be said to-be-detected avoiding action of the vehicle does not collide with the vehicle at the intersection of manner.
| 10. The test field for avoiding dynamic obstruction capability the automatic driving vehicle according to claim 9, wherein said dynamic obstacles can have vehicle capable of avoiding distance, position with the intersecting between the testing vehicle appearing in the mechanical visual acquisition range of the vehicle to be measured, and the dynamic obstacle is dummy, animal model, non-motor vehicle or motor vehicle, wherein, said dummy, animal model or non-motor comprises carrying or sending device location information and/or motion information; the vehicle includes a wireless communication device capable of transmitting travel path information of the vehicle, has V2V communication protocol, the test field further comprises running test management centre, the operation test management centre comprises sending road condition information to the vehicle to be measured. Wireless communication equipment with V2N communication protocol, the test field further comprises set on the road or road side of the device, the road side device comprises an automatic detector for detecting the road condition, the road related information received from the detector of the analytical process to form pre-warning information processing module; and wireless communication equipment with V2I communication protocol, the V2I communication protocol of the wireless communication device is capable of sending the warning information to the vehicle to be measured, the test field further comprises a signal lamp and a sight barrier, the signal lamp is installed on the road and allows the vehicle to be measured, the line-of-sight barrier blocking the vehicles to be detected through machine vision to find the position of the dynamic barrier. | The method involves placing a vehicle on a road for controlling dynamic obstacle according to a traveling route and a vehicle moving route. The dynamic obstacle is controlled under action avoiding condition. A vehicle position is determined by using the traveling route for moving a vehicle. Vehicle running response is obtained according to the traveling route on the road for avoiding dynamic obstacle movement and dynamic obstacle collision. Machine vision acquisition range of the detected vehicle position is determined based on the traveling route. An INDEPENDENT CLAIM is also included for an automatic driving vehicle performance testing device. Automatic driving vehicle performance testing method. The method enables testing different automatic driving vehicle so as to improve result testing reliability and automatic driving vehicle barrier capability. The drawing shows a schematic view of an automatic driving vehicle performance testing method. |
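The timing at the heart of the scenario above — start the dynamic obstacle so that, if the vehicle under test takes no avoidance action, both reach the crossing point at the same time, while the obstacle still appears early enough to be avoidable (claim 10) — is a small constant-speed kinematics calculation. The speeds and the minimum avoidance distance below are illustrative assumptions, not values from the patent.

def obstacle_release_trigger_m(vehicle_speed_mps, obstacle_speed_mps,
                               obstacle_path_m, min_avoid_distance_m=40.0):
    """Distance of the vehicle under test from the crossing point at which to start the obstacle.

    Chosen so that both arrive at the crossing point together if no avoidance action is taken,
    but never closer than the distance the vehicle needs in order to react and avoid.
    """
    time_to_cross_s = obstacle_path_m / obstacle_speed_mps
    trigger_m = vehicle_speed_mps * time_to_cross_s
    return max(trigger_m, min_avoid_distance_m)

if __name__ == "__main__":
    # Vehicle at 10 m/s, dummy 6 m from the crossing point moving at 1.5 m/s:
    # start the dummy when the vehicle is 40 m out (the 4 s crossing time gives exactly 40 m here).
    print(obstacle_release_trigger_m(10.0, 1.5, 6.0))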
Please summarize the input | Test field for an automated driving vehicle's ability to avoid dynamic obstacles. The utility model relates to the technical field of automated driving vehicle performance testing, and in particular to a test field for testing an automated driving vehicle's ability to avoid dynamic obstacles. The test field comprises a road and a dynamic obstacle; the movement route of the dynamic obstacle intersects the travel route of the vehicle under test at a crossing point on the road, and the dynamic obstacle moves in such a way that, if the vehicle under test takes no avoidance action, the two will collide at the intersection of the travel route and the movement route. On the one hand, compared with the theoretical data of computer-simulated vehicle operating software, such a test field is closer to the actual driving environment, so the test results represent the vehicle's dynamic obstacle avoidance ability more accurately, and it is safer than testing on actual roads; on the other hand, the test field can serve as a standardized scenario for testing different automated driving vehicles, making the test results more authoritative and reliable.|1. A test field for an automated driving vehicle's ability to avoid dynamic obstacles, characterized in that it comprises: a road; a dynamic obstacle whose movement route intersects the travel route of the vehicle under test at a crossing point located on the road, the dynamic obstacle being able to move in such a way that, if the vehicle under test takes no avoidance action, it will collide with the vehicle under test at the crossing point; a track defining the movement route of the dynamic obstacle; and a drive device that drives the dynamic obstacle along the track.
| 2. The automatic driving vehicle according to claim 1 the dynamic barrier ability of the test field, wherein the dynamic obstacle can be left between the vehicle sidestep with said intersection when the distance of the measured vehicle motion in the mechanical visual acquisition range of the vehicle to be detected.
| 3. The automatic driving vehicle according to claim 1 the dynamic barrier ability of the test field, wherein, further comprising: automatically detecting the road condition and sending the early warning information to the vehicle when detecting the road side device with dynamic road condition of the barrier.
| 4. The automatic driving vehicle according to claim 3 the dynamic barrier ability of the test field, wherein the road side device comprises an automatic detector for detecting road condition the road condition related information received from the detector of the analytical process to form pre-warning information processing module and a wireless communication device with V2I communication protocol, and processing the detector module in the road side device, a processing module with a V2I communication protocol of the wireless communication device communication connection for information transmission, the V2I communication protocol of the wireless communication device is capable of sending the alarming information to the tested vehicle.
| 5. The automatic driving vehicle according to claim 1 the dynamic barrier ability of the test field, wherein it further comprises a management centre operation test, the operational test management centre comprises sending road condition information to the to-be-detected vehicle, with the V2N communication protocol of the wireless communication device.
| 6. The automatic driving vehicle according to claim 5 the dynamic barrier ability of the test field, wherein the dynamic obstacle is dummy, animal model or non-motor vehicle, the dummy, animal model or non-motor vehicle comprises taking or sending location information for the mobile device and/or movement information, the operation test management centre communication connected with the said drive device for controlling the drive device to drive the dummy, animal model or non-motor vehicle movement.
| 7. The automatic driving vehicle according to claim 5 the dynamic barrier ability of the test field, wherein the dynamic obstacle for motor vehicle, said motor vehicle is an unmanned vehicle. the wireless communication device comprises sending the motor vehicle travel path information of the vehicle, having a V2V communication protocol and a V2N communication protocol, the operation test management centre and the vehicle are in communication connection through their own wireless communication device so as to test the operation management centre controls the vehicle running.
| 8. The automatic driving vehicle according to claim 1 the dynamic barrier ability of the test field, wherein it further comprises a signal lamp, the signal lamp is installed on the road and indicates to allow the vehicle to pass.
| 9. The automatic driving vehicle according to claim 1 the dynamic barrier ability of the test field, wherein the further barrier, the sight comprising: a sight barrier blocking the vehicle through machine vision finds the position of the dynamic barrier.
| 10. The automatic driving vehicle according to claim 1 the dynamic barrier ability of the test field, wherein it further comprises a collecting module for interconnection with the tested vehicle for collecting the information of the vehicle. | The utility model claims an automatic driving vehicle performance test technical field, especially claims an automatic driving vehicle barrier capability of avoiding dynamic test field. test field comprising running route of road and the dynamic obstacle, a movement route of the dynamic obstacle, crossing with the tested vehicle is located on the road, and a dynamic obstacle to the to-be-tested vehicle does not make action under the condition that will collide with the vehicle at a running route and the movement route of the motion. In one aspect, such a test field compared to a computer simulated vehicles operating software the theoretical data, more close to the actual running environment, so that the test result can more accurately represent the ability of obstacle avoiding dynamic measured vehicle, and compared with the actual road of road safer measuring, on the other hand, the test field can be used for testing different automatic driving vehicle as standard scene, so that the test result is more authoritative and reliable. |
Please summarize the input | Method and test field for testing the capability of an automatic driving vehicle to avoid approaching vehicles. The invention relates to the technical field of automatic driving vehicle performance testing, and in particular to a method and a test field for testing the capability of an automatic driving vehicle to avoid approaching vehicles. In the test method and test field, the test field comprises a road and a first reference vehicle. The scene is arranged by controlling the first reference vehicle to drive in front of or behind the tested vehicle while keeping an avoidance distance, to drive side by side with the tested vehicle, or to travel with the tested vehicle on a lane adjacent to the tested vehicle, and then controlling the first reference vehicle to change its speed and/or direction of travel so as to approach the tested vehicle. This arrangement is closer to the actual running environment, so the test result more accurately shows the tested vehicle's capability of avoiding approaching vehicles, and the test is safer than testing on actual roads. Secondly, by matching the tested vehicle's response, the evaluation of the vehicle's avoidance capability is standardized, making the test result authoritative and reliable.|1. A method for testing the capability of an automatic driving vehicle to avoid approaching vehicles, comprising a test field scene arrangement and a tested vehicle response, wherein the scene arrangement is that the test field comprises a road and a first reference vehicle, and the tested vehicle is placed on the road; the first reference vehicle is controlled to drive in front of or behind the tested vehicle while keeping an avoidance distance, or to drive side by side with the tested vehicle, or to travel with the tested vehicle on a lane adjacent to the tested vehicle; the first reference vehicle is then controlled to change its speed and/or direction of travel so as to approach the tested vehicle; and the tested vehicle response is that the tested vehicle starts driving normally on the road and, when the first reference vehicle approaches, performs an avoiding action with respect to the first reference vehicle so as to avoid or reduce a collision with the first reference vehicle.
| 2. The method for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 1, wherein the scene arrangement further includes that the first reference vehicle comprises a wireless communication device with a V2V communication protocol; when the first reference vehicle changes its speed and/or direction of travel to approach the tested vehicle, it sends early-warning information to the tested vehicle through the V2V communication protocol of its wireless communication device; and the tested vehicle response further comprises that the tested vehicle receives the early-warning information sent by the first reference vehicle, analyzes and processes it in combination with its own state, and finally completes the avoiding action with respect to the first reference vehicle so as to avoid or reduce a collision with the first reference vehicle.
| 3. The method for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 2, wherein the scene arrangement further comprises that the test field further comprises a road side device set on or beside the road; the road side device comprises a detector for automatically detecting road conditions, a processing module that analyzes the road condition information received from the detector to form pre-warning information, and a wireless communication device having a V2I communication protocol, the wireless communication device sending the pre-warning information formed by the processing module to the tested vehicle; the wireless communication device of the first reference vehicle further comprises a V2I communication protocol, and the first reference vehicle sends warning information to the road side device through its wireless communication device; the road side device sends the pre-warning information to the tested vehicle after receiving the warning information sent by the first reference vehicle; and the tested vehicle response further comprises that the tested vehicle receives the warning information sent by the road side device, analyzes and processes it in combination with its own state, and finally completes the avoiding action with respect to the first reference vehicle so as to avoid or reduce a collision with the first reference vehicle.
| 4. The method for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 1, wherein the scene arrangement further comprises that the test field further comprises a second reference vehicle, which is controlled to travel on the road around the tested vehicle; and the tested vehicle response further comprises that the tested vehicle, while performing the avoiding action with respect to the first reference vehicle, also avoids a collision with the second reference vehicle.
| 5. The method for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 4, wherein the scene arrangement further comprises that the second reference vehicle comprises a wireless communication device with a V2V communication protocol; and the tested vehicle response further comprises that the tested vehicle sends early-warning information to the second reference vehicle.
| 6. The method for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 1, wherein the scene arrangement further comprises that the test field further comprises a road safety facility set on the road or at the road side, and the tested vehicle runs on the lane adjacent to the road safety facility; and the tested vehicle response further comprises that the tested vehicle, while performing the avoiding action with respect to the first reference vehicle, also avoids a collision with the road safety facility.
| 7. The method for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 1, wherein the scene arrangement further comprises that the test field further comprises an operation test management centre, the operation test management centre comprising a wireless communication device with a V2N communication protocol; and the tested vehicle response further comprises that the tested vehicle sends road condition information to the operation test management centre.
| 8. The method for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 1, wherein the tested vehicle response further comprises one or several of the following: a. the tested vehicle reminds other vehicles of the road condition ahead; b. the tested vehicle keeps itself on the road while performing the avoiding action with respect to the first reference vehicle; c. the tested vehicle performs an alarm action.
| 9. A test field for testing the capability of an automatic driving vehicle to avoid approaching vehicles, comprising a road and a first reference vehicle, wherein the first reference vehicle can drive in front of or behind the tested vehicle while keeping an avoidance distance, run side by side with the tested vehicle, or run opposite to the tested vehicle on a lane adjacent to the tested vehicle, and then change its speed and/or direction of travel so as to approach the tested vehicle.
| 10. The test field for testing the capability of an automatic driving vehicle to avoid approaching vehicles according to claim 9, further comprising: a second reference vehicle that can travel on the road around the tested vehicle, the second reference vehicle comprising a wireless communication device with a V2V communication protocol; a road side device set on or beside the road, the road side device comprising a detector for automatically detecting road conditions, a processing module that analyzes the road condition information received from the detector to form pre-warning information, and a wireless communication device having a V2I communication protocol that sends the pre-warning information formed by the processing module to the tested vehicle, wherein the first reference vehicle comprises a wireless communication device with a V2V communication protocol and a V2I communication protocol, the first reference vehicle sends warning information to the road side device and the tested vehicle through its wireless communication device, and the road side device sends the pre-warning information to the tested vehicle through its wireless communication device after receiving the warning information sent by the first reference vehicle; a road safety facility set on or beside the road; and an operation test management centre comprising a wireless communication device with a V2N communication protocol, wherein the wireless communication device of the first reference vehicle further comprises a V2N communication protocol, and the operation test management centre and the first reference vehicle are communicatively connected through their respective wireless communication devices having the V2N communication protocol so that the operation test management centre controls the running of the first reference vehicle. | The method involves placing a main vehicle on a road. A reference vehicle is controlled to run side by side with the main vehicle while maintaining distance during driving, or to travel with the main vehicle on an adjacent lane. The reference vehicle is then controlled to change its speed and/or direction of travel toward the main vehicle, and the main vehicle takes avoiding action to avoid a collision with the reference vehicle. The reference vehicle is provided with wireless communication equipment supporting a vehicle-to-vehicle (V2V) communication protocol. An INDEPENDENT CLAIM is also included for a test field for testing the capability of an automatic driving vehicle to avoid approaching vehicles. Automatic driving vehicle capability testing method. The reference vehicle is controlled to change its speed and/or direction of travel toward the main vehicle so that the main vehicle's avoidance response can be evaluated, making the test result authoritative and reliable. The drawing shows a schematic view of an automatic driving vehicle capability testing system.
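Claims 2, 3 and 10 of this patent describe a two-path warning flow: the first reference vehicle announces its approach directly to the tested vehicle over V2V and also to a road side device over V2I, which relays a pre-warning to the tested vehicle; the tested vehicle then combines whichever warnings arrive with its own state before choosing an avoidance manoeuvre. A minimal Python sketch of that relay, with hypothetical class and field names:

```python
class FirstReferenceVehicle:
    def __init__(self, v2v_radio, v2i_radio):
        self.v2v_radio = v2v_radio
        self.v2i_radio = v2i_radio

    def approach(self, delta_speed, delta_heading):
        warning = {"type": "approach", "dv": delta_speed, "dh": delta_heading}
        self.v2v_radio.send("tested_vehicle", warning)     # direct V2V warning
        self.v2i_radio.send("road_side_device", warning)   # warning routed via the road side device

class RoadSideDevice:
    def __init__(self, v2i_radio):
        self.v2i_radio = v2i_radio

    def on_warning(self, warning):
        # Relay the received warning to the tested vehicle as pre-warning information.
        self.v2i_radio.send("tested_vehicle", {"type": "pre_warning", **warning})

class TestedVehicle:
    def __init__(self):
        self.warnings = []

    def on_message(self, msg):
        self.warnings.append(msg)

    def decide(self, own_state):
        # Combine received warnings with the vehicle's own state and pick a manoeuvre.
        if any(w["type"] in ("approach", "pre_warning") for w in self.warnings):
            return "avoid_first_reference_vehicle"
        return "keep_lane"
```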
Please summarize the input | Method and test field for testing the lane change capability of an automatic driving vehicle. The invention relates to the technical field of automatic driving vehicle performance testing, and in particular to a method and a test field for testing the lane change capability of an automatic driving vehicle. In the test method and test field, the test field comprises a road, an operation test management centre and a first reference vehicle; the road comprises an adjacent first lane and second lane; the tested vehicle is placed on the first lane; the first reference vehicle runs on the first lane in front of the tested vehicle; and the operation test management centre then sends a lane change command or a driving task command to the tested vehicle. The test method and test field are closer to the actual running environment, so the test result can more accurately show the lane change capability of the tested vehicle; the test is safer than testing on actual roads; and the method is suitable for different automatic driving vehicles, standardizing the evaluation of lane change capability and making the test result more authoritative and reliable.|1. A method for testing the lane change capability of an automatic driving vehicle, comprising a test field scene arrangement and a tested vehicle response, wherein the scene arrangement is that the test field comprises a road, an operation test management centre and a first reference vehicle; the road comprises an adjacent first lane and second lane; the tested vehicle is placed on the first lane; the first reference vehicle is controlled to run on the first lane in front of the tested vehicle; and the operation test management centre then sends a lane change command or a driving task command to the tested vehicle, wherein the driving task command includes the mileage from the tested vehicle's position to the destination and a required arrival time or travel time, and when the driving task command is sent, the running speed of the first reference vehicle is controlled to be less than the mileage divided by the difference between the required arrival time and the current time, or less than the mileage divided by the required travel time; and the tested vehicle response is that the tested vehicle follows the first reference vehicle on the first lane and, after receiving the lane change command or the driving task command, safely enters the second lane.
| 2. The method for testing the lane change capability of an automatic driving vehicle according to claim 1, wherein the scene arrangement further comprises that, in the case where the lane change command is issued, the lane change command requires changing lanes twice between the two adjacent lanes so as to overtake the first reference vehicle; and the tested vehicle response further comprises that, after safely entering the second lane, the tested vehicle safely re-enters the first lane and is located in front of the first reference vehicle.
| 3. The method for testing the lane change capability of an automatic driving vehicle according to claim 1 or 2, wherein the scene arrangement further comprises that the test field further comprises a second reference vehicle, which is controlled to run on the second lane in the same direction as the tested vehicle and the first reference vehicle, the second reference vehicle keeping a distance from the tested vehicle that allows a lane change.
| 4. The method for testing the lane change capability of an automatic driving vehicle according to claim 3, wherein the scene arrangement further comprises that the first reference vehicle and the second reference vehicle each comprise a wireless communication device with a V2V communication protocol, and each receives, through its wireless communication device, the lane change request sent by the tested vehicle and replies with consent information after receiving the request; and the tested vehicle response further comprises that the tested vehicle sends a lane change request to the first reference vehicle and the second reference vehicle and performs the lane change action after receiving the consent information replied by the first reference vehicle and the second reference vehicle.
| 5. The method for testing the lane change capability of an automatic driving vehicle according to claim 4, wherein the operation test management centre comprises a wireless communication device with a V2N communication protocol; the wireless communication device of the first reference vehicle further comprises a V2N communication protocol, and the operation test management centre and the first reference vehicle are communicatively connected through their respective wireless communication devices having the V2N communication protocol so that the operation test management centre controls the running of the first reference vehicle; and the wireless communication device of the second reference vehicle further comprises a V2N communication protocol, and the operation test management centre and the second reference vehicle are communicatively connected through their respective wireless communication devices having the V2N communication protocol so that the operation test management centre controls the running of the second reference vehicle.
| 6. The method for testing the lane change capability of an automatic driving vehicle according to claim 1, wherein the scene arrangement further comprises that the test field further comprises a third reference vehicle, which is controlled to travel opposite to the first reference vehicle on the second lane; the operation test management centre comprises a wireless communication device with a V2N communication protocol, the third reference vehicle comprises a wireless communication device with a V2N communication protocol, and the operation test management centre and the third reference vehicle are communicatively connected through their respective wireless communication devices having the V2N communication protocol so that the operation test management centre controls the running of the third reference vehicle; the first reference vehicle comprises a wireless communication device with a V2V communication protocol, the wireless communication device of the third reference vehicle also has a V2V communication protocol, and the first reference vehicle and the third reference vehicle each receive, through their wireless communication devices, the lane change request sent by the tested vehicle and reply with consent information after receiving the request; and the tested vehicle response further comprises that the tested vehicle sends a lane change request to the first reference vehicle and the third reference vehicle and performs the lane change action after receiving the consent information replied by the first reference vehicle and the third reference vehicle.
| 7. The method for testing the lane change capability of an automatic driving vehicle according to claim 1, wherein the scene arrangement further comprises that the test field further comprises a road side device set on or beside the road; the road side device comprises a detector for automatically detecting road conditions, a processing module that analyzes the road condition information received from the detector to form traffic information, and a wireless communication device with a V2I communication protocol, the wireless communication device sending the traffic information formed by the processing module for the road ahead of the tested vehicle to the tested vehicle; and the tested vehicle response further comprises that the tested vehicle receives the traffic information sent by the road side device, analyzes and processes it in combination with its own state, and finally completes the lane change action.
| 8. The method for testing the lane change capability of an automatic driving vehicle according to claim 1, wherein the scene arrangement further comprises that the operation test management centre comprises a wireless communication device with a V2N communication protocol; the operation test management centre sends the lane change command or the driving task command to the tested vehicle through the wireless communication device, and further sends the road condition information of the road ahead of the tested vehicle to the tested vehicle; and the tested vehicle response further comprises that the tested vehicle receives the road condition information sent by the operation test management centre, analyzes and processes it in combination with its own state, finally completes the lane change action, and sends its lane change information to the operation test management centre.
| 9. A test field for testing the lane change capability of an automatic driving vehicle, comprising: a road comprising an adjacent first lane and second lane; a first reference vehicle that can run on the first lane in front of the tested vehicle; and an operation test management centre comprising a wireless communication device with a V2N communication protocol, the operation test management centre being able to send a lane change command or a driving task command to the tested vehicle through the wireless communication device, wherein, in the case where the driving task command is sent, the driving task command includes the mileage from the tested vehicle's position to the destination and a required arrival time or travel time, and the travel speed of the first reference vehicle is less than the mileage divided by the difference between the required arrival time and the current time, or less than the mileage divided by the required travel time.
| 10. The test field for testing the lane change capability of an automatic driving vehicle according to claim 9, further comprising a second reference vehicle or a third reference vehicle, wherein the second reference vehicle can run on the second lane in the same direction as the tested vehicle and the first reference vehicle while keeping a distance from the tested vehicle that allows a lane change, and the third reference vehicle can run counter to the first reference vehicle on the second lane; a road side device set on or beside the road, the road side device comprising a detector for automatically detecting road conditions, a processing module that analyzes the road condition information received from the detector to form traffic information, and a wireless communication device with a V2I communication protocol, the road side device being able to send the traffic information formed by the processing module for the road ahead of the tested vehicle to the tested vehicle through the wireless communication device, wherein the operation test management centre further sends the road condition information of the road ahead of the tested vehicle to the tested vehicle; wherein the first reference vehicle comprises a wireless communication device with a V2V communication protocol and a V2N communication protocol, can receive the lane change request sent by the tested vehicle through its wireless communication device and reply with consent information, and is an unmanned vehicle, and the operation test management centre and the first reference vehicle are communicatively connected through their respective wireless communication devices having the V2N communication protocol so that the operation test management centre controls the running of the first reference vehicle; wherein the second reference vehicle comprises a wireless communication device with a V2V communication protocol and a V2N communication protocol, can receive the lane change request sent by the tested vehicle through its wireless communication device and reply with consent information after receiving the request, and is an unmanned vehicle, and the operation test management centre and the second reference vehicle are communicatively connected through their respective wireless communication devices having the V2N communication protocol so that the operation test management centre controls the running of the second reference vehicle; and wherein the third reference vehicle comprises a wireless communication device with a V2V communication protocol and a V2N communication protocol, can receive the lane change request sent by the tested vehicle through its wireless communication device and reply with consent information after receiving the request, and is an unmanned vehicle, and the operation test management centre and the third reference vehicle are communicatively connected through their respective wireless communication devices having the V2N communication protocol so that the operation test management centre controls the running of the third reference vehicle. | The method involves placing the tested vehicle on a first lane. A first reference vehicle is controlled to run on the first lane in front of the tested vehicle. A lane change command or a driving task command is transmitted to the tested vehicle by an operation test management center, where the driving task command includes the mileage to the destination and a required arrival time or travel time, and the running speed of the first reference vehicle is kept lower than the mileage divided by the remaining time. The tested vehicle response is that the tested vehicle follows the first reference vehicle and safely enters a second lane after receiving the lane change command or the driving task command. Automatic driving vehicle lane change capability testing method. The method is closer to the actual running environment, more accurately shows the lane change capability of the tested vehicle, and improves test result accuracy and reliability with high safety. The drawing shows a schematic view of an automatic driving vehicle lane change capability test field.
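The driving task command in claims 1 and 9 implies a simple numeric constraint: the first reference vehicle is deliberately kept below the minimum speed the tested vehicle needs to reach the destination on time, so the only way to satisfy the task is to change lanes and overtake; claims 4 and 6 add a V2V request/consent handshake before the lane change. A minimal Python sketch of both ideas, with hypothetical names and units:

```python
def reference_vehicle_speed_limit(mileage_m: float,
                                  required_arrival_s: float,
                                  current_time_s: float) -> float:
    """Max speed allowed for the first reference vehicle (m/s).

    Keeping it below mileage / remaining_time forces the tested vehicle
    to overtake in order to arrive on time.
    """
    remaining = required_arrival_s - current_time_s
    return mileage_m / remaining

def may_change_lane(v2v_radio, neighbour_ids) -> bool:
    """V2V handshake: ask each neighbouring reference vehicle for consent."""
    replies = [v2v_radio.request(nid, {"type": "lane_change_request"})
               for nid in neighbour_ids]
    return all(r.get("consent") for r in replies)

# Example: a 2 km task to be finished in 200 s needs at least 10 m/s,
# so the first reference vehicle is held below 10 m/s.
print(reference_vehicle_speed_limit(2000.0, 200.0, 0.0))  # -> 10.0
```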
Please summarize the input | Apparatus and system to manage monitored vehicular flow rate. The system apparatus manages and adjusts monitored traffic density and/or speed in relationship to spatial locational flow rates through a plurality of mobile and/or stationary encrypted communication device sensors. The system transmits calculated digital navigational directives throughout a network of domains within any infrastructure, in direct secure communication with humans, drivers and/or owners and/or the vehicle transport mechanisms themselves, viewable and/or audible on the iNavX2 virtual interface, providing on-demand dimensional spatial mapping locational services, driver- and/or vehicle-generated system requests for certified advertisements, and tracking of transmitted navigational maneuvers. The invention claimed is:
| 1. A method, comprising a secure navigational system and apparatuses networked within at least one domain and configured within at least one infrastructure for the management of monitored aggregate vehicular traffic density while maintaining variable sustained vehicle velocity, further comprising the following elements, characterized by:
at least one central system server configured to perform calculable equations and transmit navigational directives, suggestions and advisements for the purpose of tracking, recording, measuring and maintaining spatial density between each transport at a calculated speed, in relationship to phase-change spatial analytics from artifacts calculated from current positional points relative to prior and at least one future positional phase point in space-time location, for at least one vehicle, human, motorcycle or bicycle transmitting its current Longlat coordinate location through a plurality of encrypted communication devices to the system;
wherein the system transmits modifiable speed variations consistent with the original route's destination time frames, and provides alternate routes based on current computed location with projected congestion artifact and density computed variables, each comprised of weather variations causing reductions in flow rates, speed or velocity of at least one vehicle transport's calculations and for each transport, a mass of vehicle transports, transmitting calculated adjustments from forecasted equations to one or more vehicles in one or more vector areas within at least one domain, detecting a predicted or predictable eventuality of congestion level change in future time space, responding to driver requests or requests from vehicles' navigational devices; the system determines a vehicle mechanism's anticipated communication interoperability issues in communication time-lag in responses to transmissions or requested data from the system, including ecomm-advice, one or more of a plurality of certified compliant encrypted communication devices, vehicles equipped with Obvipro, pAvics and other similar certified compliant communication devices equipped for vehicle transports, and one or more programs configured within the system calculate and advise one or more vehicles, drivers, owners, operators or vehicle controllers of requested alternate routes or system-generated routes based on density factors and construction areas, along with human factors, vehicle capabilities and capacity to navigate:
wherein the system computes and compares speed density variables in calculated flow rates from change in weather conditions on selected routes or entire areas affected by atmospheric changes to assist minimizing the need to break inertia, control safe density factors and manage flow rates, maximizing constant variable velocity throughout for at least one vehicle transport or a plurality of vehicle transports within the domain that is networked within an infrastructure, and transmits to a single vehicle or at least one vehicle within a mass that is not registered due to communication error and that is registered authenticated link sync'd and paired with a plurality of ecomm-devices and system;
wherein the system comprises adjustments in calculated configured spatial density occur in proportion with current velocity for at least one transport within a mass or a cluster of transports or a single vehicle transmitting encrypted navigational directives, and further performs calculated adjustments from spatial phase-change density in future time in proportional to spatial velocity continuously, including cross-layer intersections, modifying speed to one or more transports calculated from optimized overlay models allowing vehicular traffic to flow safely; and
further the secure navigational system and each encrypted communication device apparatus, comprises one or more non-transitory computer-implemented programmed methods, further comprised of a plurality of storage mediums, in communication with and comprised of at least one server configured in a localized cloud sync'd enterprise configuration communicating through a sub navigational system transmitting, receiving reactive navigational directives, recording the amount of vehicular traffic proximity density at a time frame (moment) relative to a specific point in time space, compared with the speed of a calculated comparative clustered mass area chosen and compared with other determined vehicles within the same mass, calculating common variations in predictable calculated overlay models and providing calculable results to manage vehicle flow rates at future points in time space or within a selected vector area, and
further comprising archived data-analytics, real-time data artifacts and other realtime telematic data transmitted to system from a plurality of device sensors synchronize link sync'd with at least one vehicle transport, transmitting engine informatics through a plurality of link sync'd ecomm-device sensors, in further communication with at least one networked server hub each authenticate, sync-d, paired and securely registered within at least one domain networked connected to each in direct communication with each other and at least one central server within at least one domain networked infrastructure;
wherein system further comprises a plurality of communications and advertisements encrypted for secure transmissions between vehicles, a plurality of ecomm-devices sensors including Object Functionality Points or Proximity Integration on vehicles and servers within a domain networked architecture; and
wherein the system server, connected cloud-based synchronized servers and all other ecomm-device OS, NOS and other certified applications used within the domain networked infrastructure receive and transmit a plurality of data, information, directives, suggestions and advertisements, including to devices viewable by certified tribal entities, owners, and drivers; the construct comprises one or more of a plurality of machine languages, one or more parts of which are converted to binary, and one or more parts of the machine languages and their converted binary parameters, or parts thereof, along with the languages converted to binary codes, are encrypted;
wherein dominion of all data analytics, data facts, data relics and all other telematic data transmitted, received, extracted, responded and acknowledged to and equipped with at least one transmitter, receiver, transponder and other certified compliant telematic ecomm-devices, including a plurality of towers in direct communication with a plurality of vehicle transports, purposely generating income for each city, township, and municipalities, state and federal DOT;
wherein the plurality of ecomm-device sensors are each equipped with at least one network interface for each stationary and mobile ecomm-devices and other compliant certified devices and servers in communication with one or more of a plurality of remote mobile and stationary encrypted device sensors equipped with Avics, each further constructed of and equipped with a media suitable for storing and processing electronic instructions to maneuver vehicle transports, to increase, decrease flow rates, velocity or speed, instruct vehicle transport or its mechanism and/ or a human to turn on vehicles lights, blinkers, transmitting instructions and communications to vehicles, mechanism or humans from system generated or one or more ecomm-devices or Obvipro detected with evasive maneuverable recommendations, directives, instructions and lane change acquisition, and further comprises other navigational maneuvers, to perform a plurality of tasks, procedures, performing analytical calculations, computations and other mathematical equations embedded within, and comprised on at least one iChipSet RFIDGPS equipped with a transponder/receiver/transmitter and entire device comprises a plurality of modules;
wherein said receiving and transmitting data comprises encapsulated encrypted data exchange or push transmissions from a plurality of at least one ecomm-device and other certified compliant devices, comprised of a configured plurality of at least one iChipset programmed to perform of at least one task or a plurality of at least one procedure, event, calculations to disseminate informatics either current or artifacts;
wherein one or more ecomm-devices sensors and vehicles communicate through one or more sub navigational systems (subnaysys), computing devices configurable variable frequencies to: transmit, receiving and responding to transponders requested signal data from a plurality of strategically calculated positioned stationary and mobile communication ecomm-device sensors and other compliant certified devices attached to a plurality of vehicle transports or from humans; and
further equipped with at least one Avics iChipset module constructed in and configured as a hardware, software downloadable application, a hardware software device or combination thereof, comprises Obvipro, pAvics and other certified ecomm-devices within any domain networked infrastructure, system transmits dimensional mapping locational services displayed on a virtual interface calculable from positional point transmitted by vehicle devices and a plurality of smart devices locational services; wherein each infrastructure comprises one or more residential domains, shopping center domains, and to be determined by a configured network protocol for each domain that is networked within municipalities, city or townships infrastructure, that may further comprised, of a plurality of modules, additional cameras and plurality of other certified compliant communication devices and sensors within one or more specific networked domains for additional security measures fer in certain vector areas, in conjunction with a plurality of hub sensor devices within at least one vehicular traffic domains infrastructure;
wherein each interface being configured to transmit, receive and to respond to at least one transponders signal of different frequencies at the same time that may pulsate;
wherein each transmitted dataset, data analytics, data facts and artifacts from vehicle transports mechanical, mechanical-electrical and electrical sensors and all other system ecomm-devices and transmitted ecomm-advice each is timestamped for traffic data and other informatics entering and exiting a plurality of encrypted communication devices link sync'd together within a cryptic VPI connection using encrypted data push transmissions, via a VCPI (Virtual Cryptic Private Infrastructure) tunnel to one or more of a plurality of encrypted communication devices and by other secure means collecting, transmitting, receiving transponder signals and responding signal data that feed into a central processing complex of at least one link sync'd cloud based Sync'd server and system responds with at least one encrypted Paired-Key acknowledging ecomm-device, Obvipro, pAvics and other certified compliant devices within the network infrastructure;
wherein said plurality of each encrypted ecomm-devices, are load-networked within a given domain networked area channeling said plurality of communications parts, through one or more selected determined configured sub navigational system tracked paths or routes to system server;
wherein infrastructure transmission devices are Sync'd to each other after registration and authenticated and paired with and to at least one other communication sensor device, vehicle, a server or the plurality of ecomm-devices in direct communication with each and system and link sync'd servers with system server, assisting in threat intelligent analytics and analysis;
wherein traffic data further comprises: vehicles phase-change spatial location, positional congestion artifact relationship with past phase-change position to calculated future phase-change spatial positional point in time space, vehicle informatics transmitted to system calculating forecasted flow rates in future spatial time for at least one vehicle transport within a mass or a single vehicle and other calculated derived intelligence necessary to perform real-time calculable equations to securely navigate vehicular traffic, further received from and by a plurality of mobile ecomm-device sensors deployed in a plurality of vehicles transmitting throughout a plurality of one or more vector-hub class sensor device hubs;
wherein system further comprises at least one AlphaVectorHub and AlphaHubs and one or more are combined with other sub hub-class and other certified compliant communication device sensors, nodes and virtual nodes transmitting secure selected advertisement requests from one or more drivers, vehicle, transports, humans and a plurality of mechanisms strategically arranged along one or more roadways, each device and advertisement is displayed in a virtual reality interface configured within iNavX2 for a plurality of transports and same being viewable in and configured to display same in at least one iNavCom center and facilities:
wherein stored particulars comprise a plurality of strategically located servers, each having at least one non shared encrypted database, in secure direct communication with at least one central server within each domain networked infrastructure, purposely to perform calculations, to detect approaching and passing vehicles, archived data artifacts, relics and disseminate shared data across tribal entities and for the purpose of to reconstruct anomalies and accidental occurrences from humans, bicycles, motorcycles, and the derived causation of a plurality of system disturbances along with Longlat positional time points before and after incident of each including on private property; and
further supported by one or more data facts comprised of: telematic data archived and indexed, comprised of: time intervals of communications and distance/time data entered/exited a plurality of compliant certified communication devices and cameras; vehicles travel speed, volume in relationship with topography and climatic conditions, flow rates and density and ecomm-device markers and time intervals and their VarChk Index, including transponder and response signal indicators and all other monitored and recorded categorized and cataloged informatics for each driver or owner and their associated elements and factors including registered location for each; vehicle, transport, motorcycle, bicycles and humans recorded in an indexed data book, each analytically compared independently and combined with one or more data artifacts collected, computing precise geographic positional location, and further verified from renderings and overlay map-objects created by each affected vehicle involved in accident or mishap, the computed result of which determines whether the human driver or owner was at fault or communications interoperability issues with system or ecomm-devices, Obvipro, pAvics or other certified compliant devices or a plurality of vehicles or transport were found or a mechanism itself was the causation of accident, incident or a mishap;
wherein a central server, ecomm-devices, Obvipro, pAvics and other certified compliant devices are configured to receive, transmit and respond to transponders signals each of variable frequencies that may pulsate and one or more parts are encrypted, communicating a plurality of traffic data artifacts and informational data to and from a plurality of calculated strategically positioned remote stationary and mobile hub ecomm-devices, sensors, and other certified compliant device sensors throughout a secure sync-d linked network within a virtual private infrastructure; therein
creating a channeled telematics network from a plurality of encrypted communication devices, and sensors equipped with at least one Avics iChipset, for each Obvipro, pAvics and other certified compliant encrypted communication devices and senors use UPnP telematic discovery service; and further transmitting and receiving calculated encrypted digital ecomm-advice directives to and from traffic vector-hub class communication sensor devices updating traffic data in one or more non shared databases; and
transmitting continuous updates for the purpose of to calculate traffic density in proportion to rate flow for one or more vehicles traversing along one or more roadways based on updated traffics dynamic data; and
transmit timing adjustments for speed, flow rates and spatial density throughout a network of secure devices to one or more transports, traffic lights, tVectorHubs and others similar compliant certified ecomm-devices and sensors to advise and transmit to a plurality of vehicles, and transports, along with quantum vector nodes and virtual nodes displaying existing and future contemplated traffic signage in a virtual interface, further providing fuel, eating and other services travelers need, requested autonomously from vehicles Obvipro's, pAvics and system informational recommendations based onboard vehicles processors current integrated capabilities with Proximity Integration, formulating XY (Long/Lat) coordinates laced together with spatial third dimension Z-Topography and Climatic Expectations commixed with human factors, vehicle capabilities to navigate, self drive-ability guided by encrypted navigational directives, vehicles reactive capacity to self navigate with instructions from the system, along with transmissions from other certified compliant vector encrypted ecomm-devices based on optimal traffic flow calculations determined by speed variations in relationship to and from traffic density, human, vehicle capability and capacity factors, continuously creating and updating calculated overlay models transmitting adjustments in vehicular velocity rate flow in proportion with density between at least two transports, a clustered mass of transport vehicles or a single vehicle within the mass;
wherein communication links are pre-configured with one or more ecomm-devices when deployed or determined by forecasted forward velocity based on recorded and posted traffic speed, and for security reasons moving data transmission in advance of transports calculated future positional points, each stationary and mobile ecomm-devices only communicate with certified compliant devices link sync'd registered paired and authenticated with system, ecomm-advices and other certified compliant devices and sensors, creating a primary line of defense for structured network protocol security established by a configured baseline for channeled telematics;
wherein comprising system transmits navigational directives to one or more certified compliant vector ecomm-devices, comprised of optimal traffic flow calculations determined by speed variations in relationship to traffic density to minimize the need to break inertia, transmitting encrypted timing adjustments over a network to one or more vehicles via encapsulated digital encrypted voice and displayed in a virtual interface for vehicles and navigational command centers and facilities and further providing encrypted virtual commands between one or more traffic lights, tVectorhubs, cameras and other compliant certified communication devices for an intersection within a domain networked infrastructure and roadway areas equipped with VectorHub Class ecomm-devices, sensors, and other types of sensor devices;
wherein tVectorHubs, vector-hubs, sub hub-class sensors, VectorHub Class ecomm-devices one or more are combined with BeaconHubs, SentryBeaconHubs, SentryHubs, SentinelHubs, Sentry Nodes, AlphaHubs, AlphaVectorHubs; b, c, d and xVectorHubs, and each sub hub sensor device's configuration performs at least one explicit task, a duty of which is specifically allocated and configured for any given sector within the network's infrastructure area; and
further comprised of each device sensor and other certified compliant device sensors, ecomm-devices are equipped with a configuration of at least one iChipset, the construct comprising at least one non-transitory machine computer-readable medium constructed with a plurality of storage mediums, at least one interface, and further comprising a non-transitory computer device configured for receiving, to transmit, and respond to transponder calls, to mark data upon entering and existing a plurality of encrypted communication devices, sensors, electrical, mechanical or electrical-mechanical device; and
further configured to determine vehicle locational positions and to detect, to warn, to advise, suggest, alert, to respond to requests from humans, a transport, or a mechanism configured within and attached to the vehicle transport, comprised of a plurality of certified registered link sync'd ecomm-devices, authenticated and paired within systems infrastructures communication devices and servers, configured with at least one Paired-Key and at least one response match set for each Paired-Key set assigned within the OS's NOS within the certified compliant ecomm-device, a device, or sensors carried by humans, attached to temporarily or permanently, comprised of a smart device installed with pAvics application or a hardware software device Obvipro and other certified ecomm-devices, and attached temporarily or permanently to a plurality of vehicle and transports;
wherein the system comprises encrypted transmitted directives to turn head lights on for autonomous and semi-autonomous vehicles, and transports are audibly or visually instructed with recommended rest time intervals for personal and commercial vehicle drivers, for vehicles equipped with or without technologies monitoring drowsiness, with audible voice statements to assist in keeping the driver awake when the system detects driver attentiveness is declining; and
further transmit visual and audible notifications to vehicle with destination arrivals time, alternate routes, maintenance items, declining fuel notifications recommending fuel node stops in a virtual interface, transport mechanism inspections, insurance, valid tags and vehicle ownership records and verification from at least one data-base and other associated informatics, said data is stored in at least one iChipset within each Obvipro and other certified compliant devices and securely stored within non-transitory machine computer-readable medium with a plurality of storage mediums and archived in at least one system servers database, and a downloadable software applications for humans, bicycles, motorcycles and of any other type certified compliant device and further comprising a Nuclex operating system providing designated lane isolation or acquisition; and
further transmitting digital directives maintaining distance factors between vehicles based on density flow rates in proportion with weight loads, historical records of vehicles capabilities, capacity to stop, self navigate and navigate by a human factors associated with each driver and their registered recorded vehicle or transport, directives formulated from computed variables associated with each particular vehicles cataloged informatics, navigational suggestions as requested by driver, owner or system derived directives for autonomous vehicles, each audibly heard or virtually viewable or both audible heard and viewable at the same time;
wherein system comprises analyzed dynamic analytical rate flow (DARF) in comparison to and in conjunction with calculated dynamic analytical lane allocations available, to assist in maintaining vehicular spatial distance and dedicated transport positioning, along with dynamic directional flow constraints calculated from inputs from network traffic congestion artifacts allowing vehicular traffic to move in variable velocity momentum, managed by a controlled network protocol protecting data's composition integrity flow;
wherein system further comprises replacing existing traffic control lights and their associated control mechanisms and stop signs with at least one tVectorHub, VectorHubs combined with Sentry or Sentinel hubs sensors or combinations thereof, transmitting encrypted timing adjustments to adjust speed, flow rates and density for at least one vehicle transport within one or more domains link sync'd in communication with a plurality of vehicles, transports, mechanisms and other certified compliant devices maneuvering or traversing on a roadway, including on toll roads; and
further comprising each network device sensor is connected to a plurality of ecomm-devices having a direct data transmission com-link from and to a plurality of system servers, the system and other mobile and stationary ecomm-device sensors and other certified compliant device sensors; communications comprise encapsulated digital encrypted data, including voice or virtual digital commands or both to one or more traffic lights or tVectorhubs or both at the same time for any given intersection, and other certified ecomm-devices, each system device is characterized including system construct transmitting a plurality of ecomm-advice; and
comprises at least one Nuclex operating system, comprised of at least one non-transitory machine computer-readable medium configured with a plurality of storage mediums, and further comprises a OS computer device having at least one non-transitory machine computer-readable medium configured with a plurality of storage mediums to perform at least one: process, task, detection of approaching vehicle and transports and system communications anomalies, intrusions, hacks or other maliciousness that disrupts system, records and transmits to system ecomm-devices ability to function properly, detecting non-engagement and non registered vehicle and transports, improper code injections, to compute at least one algorithm, to perform mathematical equations, to transmit, receive from and to system and from and to the ecomm-device and other certified compliant device sensors generate digital directives, to advise; to initiate 911 service activated by a human from the smart certified complaint ecomm-device, for cyclers, runners and motorcycles or an ecomm-device activated by vehicle transport each providing locational services and the location at time of 911 activation, to respond to a transponder signal of a plurality of frequencies and to initiate a transponder signal and ecomm-advice directives transmitted to system; and
further each system, ecomm-devices and other certified compliant device sensors and modules comprise at least one or more programmed events, processes, tasks, procedures, a plurality of decisions, detecting, perform calculable equations, acknowledging a plurality of transmissions, to respond to a transponder or to activate a transponder signal, to initiate or activate 911 calls and one or more parts thereof are hard coded into at least one iChipSet;
wherein each OS comprises one or more iChipSets constructed and independent of each other in its functionality, purpose, programmed procedures, time stamp data entry and existing each iChipset and a plurality of devices, or sensors, to perform one or more processes, calculations, events, to transmit and receive, to perform one or more tasks, to make or activate a decision and one or more parts are combined with each other within the certified compliant ecomm-devices;
wherein system and vehicle ecomm-devices comprise a configured generated GPSGIS Virtual Telematic Architecture in a 2D, 3D or a 4D toggled interface viewed by touch, voice or both on iNavX2's virtual interface, displaying all existing and contemplated traffic signage, including representations for speed, spatial locations of surrounding vehicles with audible and visual navigational maneuvers including a plurality of emergency notifications and dimensional mapping locational service as requested by driver or a human, from system derived protocols, from the vehicle along with roads change display and visually indicating and recording that the ecomm-device vehicle is receiving transmissions from or to other vehicles (V2V) and system, infrastructure to vehicle (I2V) or V2I including a human to vehicle (H2V) and V2H.
| 2. The secure navigational system of claim 1, wherein comprises one or more of a plurality of remote stationary and mobile communication hubs, devices, nodes or virtual nodes and a plurality of encrypted communication devices sensors, each comprised of at least one computer-executable instruction, further comprised of one or more programs hardcoded within a non-transitory machine computer-readable medium with a plurality of storage mediums configured within a plurality of ecomm-devices, each comprised of at least one RFIDGPS transponder/receiver/transmitter iChipset, better known as Avics iChipset; structured as a hardware, a downloadable hardware software application, a software application for smart devices or combination thereof to communicate with system, a plurality of Sync'd devices and systems servers, for the purpose of to transmit navigational directives and locational i... | The method involves transmitting/receiving encrypted digital communication-advice directives to/from traffic vectorhub class communication devices. Traffic density is continuously calculated in proportion to rate flow for vehicles (102) traveling along roadways based on dynamic data. Timing adjustments are transmitted over a network (110) to a traffic light, quantum vector nodes and virtual nodes formulating XY coordinates in an overlay model from certified vector communication-devices based on optimal traffic flow calculations determined by speed variations in relationship to the density. The dynamic data is traffic dynamic data. INDEPENDENT CLAIMS are also included for the following:a computer-apparatus structure for managing monitored vehicular aggregate traffic densitya computer-implemented procedure for managing monitored vehicular aggregate traffic density. Computer-implemented method for managing monitored vehicular aggregate traffic density. The method enables selectively powering down street light when traffic is less, thus reducing fossil fuel supply consumption rate by implementing Nxgen traffic system and allowing the vehicles to move as fast as possible without unnecessary idling, while optimizing exhausted energy consumed breaking inertia and speed bursts. The method enables integrating object functionality points or proximity integration to determine exact phase-change spatial relationship with each vehicle and allow prompt reactive response interval feeds into onboard vehicle processor, thus allowing each vehicle with certified communication-device to be provided with an ability to encapsulate logistical response times on preventative measures regarding accidental collisions and saving municipalities, states or countries significantly and reducing traffic expenditures, while making a safer traffic landscape, minimizing associated fatalities when accidental contact occurs with traffic signage and decreasing fuel consumption in manufacturing these items including material and labor costs for all signage. The drawing shows a block diagram of a system for managing and monitoring traffic density flow. '(Drawing includes non-English language text)' 102Vehicles104Receiving stations110Network112Computers114Databases |
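Claim 1 repeatedly ties three quantities together: flow rate, density, and velocity, and states that spatial density (spacing) is adjusted in proportion with current velocity so as to minimize the need to break inertia. The classic traffic-flow relation q = k·v (flow = density × speed) captures this, and the sketch below shows how a target headway and a speed nudge could be derived from it; the constants and function names are illustrative assumptions, not taken from the patent.

```python
def flow_rate(density_veh_per_km: float, speed_kmh: float) -> float:
    """Fundamental traffic-flow relation: flow (veh/h) = density (veh/km) * speed (km/h)."""
    return density_veh_per_km * speed_kmh

def target_headway_m(speed_mps: float, time_gap_s: float = 1.5, min_gap_m: float = 5.0) -> float:
    """Spacing grows in proportion with current velocity (assumed constant-time-gap policy)."""
    return min_gap_m + time_gap_s * speed_mps

def speed_adjustment(current_gap_m: float, speed_mps: float, gain: float = 0.3) -> float:
    """Return a speed delta (m/s) nudging the vehicle toward the target spacing."""
    return gain * (current_gap_m - target_headway_m(speed_mps))

# Example: at 20 m/s with a 30 m gap, the target gap is 35 m, so slow down slightly.
print(speed_adjustment(current_gap_m=30.0, speed_mps=20.0))  # -> -1.5
```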
Please summarize the input | Switching wireless network sites based on vehicle velocity. The disclosed technology proposes a new methodology to include the effect of speed and direction of a UE into the threshold used for determining when to switch between a 4G UL connection and a 5G UL connection. The system can use a lookup table with various speeds mapping to varying thresholds. The system can use an accelerometer sensor or digital compass to determine the direction of the vehicle, such as heading away from or toward the 5G site, so the vehicle can switch sooner from 5G-NR to LTE and from LTE to NR, respectively. For C-V2X applications, latency is an important factor because 5G technology provides shorter latency than 4G; thus keeping the link on 5G is preferred when under good coverage. Further, the idea is not limited to UL, 5G and/or vehicle technologies, but can also be applied to DL direction, Wi-Fi and/or drone technologies as well. The invention claimed is:
| 1. A method comprising:
obtaining, at a vehicle, signal quality data, from a fifth-generation wireless technology (5G) site and a fourth-generation wireless technology (4G) site of a cellular network, wherein the 5G and 4G sites provide at least a portion of a signal associated with the signal quality data;
obtaining, at the vehicle, a velocity of the vehicle relative to at least the 5G site;
creating a dynamic signal quality threshold by increasing the dynamic signal quality threshold with an increasing velocity of the vehicle, increasing the dynamic signal quality threshold when the vehicle is moving away from the 5G site, and decreasing the dynamic signal quality threshold when the vehicle is moving toward the 5G site; and
switching a cellular network connection of the vehicle between the 5G site and the 4G site based on the dynamic signal quality threshold.
| 2. The method of claim 1, comprising:
obtaining map data for the cellular network, wherein the map data indicates variations in the signal quality data based on geographic location and a future path of the vehicle;
determining a location where the signal quality data associated with the 5G site causes an interruption in a connection between the vehicle and the 5G site based on the map data and the future path of the vehicle; and
switching the connection from the 5G site to the 4G site before the vehicle reaches the determined location.
| 3. The method of claim 1, wherein creating the dynamic signal quality threshold further comprises increasing the dynamic signal quality threshold when the vehicle is in communication with the 4G site and decreasing the dynamic signal quality threshold when the vehicle is in communication with the 5G site.
| 4. The method of claim 1, wherein creating the dynamic signal quality threshold further comprises obtaining a table correlating a speed of the vehicle and a direction of motion of the vehicle with one of multiple signal quality thresholds.
| 5. The method of claim 4, comprising determining a function correlating the speed of the vehicle and the one of multiple signal quality thresholds based on the table.
| 6. The method of claim 1, wherein the signal quality data comprises a signal to interference plus noise ratio (SINR), a reference signal received power (RSRP), a bit error rate, or a packet error rate.
| 7. The method of claim 1, wherein switching the cellular network connection of the vehicle comprises:
switching the cellular network connection of the vehicle between the 5G site and the 4G site when the signal quality data is below the dynamic signal quality threshold.
| 8. At least one non-transient computer-readable medium, carrying instructions that, when executed by at least one data processor, performs a method comprising:
obtaining, at a vehicle, signal quality data, from a fifth-generation wireless technology (5G) site and a fourth-generation wireless technology (4G) site of a cellular network, wherein the 5G and 4G sites provide at least a portion of a signal associated with the signal quality data;
obtaining, at the vehicle, a velocity of the vehicle relative to at least the 5G site;
providing a dynamic signal quality threshold by increasing the dynamic signal quality threshold with an increasing velocity of the vehicle, increasing the dynamic signal quality threshold when the vehicle is moving away from the 5G site, and decreasing the dynamic signal quality threshold when the vehicle is moving toward the 5G site; and
switching a cellular network connection of the vehicle between the 5G site and the 4G site based on the dynamic signal quality threshold.
| 9. The non-transient computer-readable medium of claim 8, comprising:
obtaining map data for the cellular network, wherein the map data indicates variations in the signal quality data based on geographic location and a future path of the vehicle;
determining a location where the signal quality data associated with the 5G site causes an interruption in a connection between the vehicle and the 5G site based on the map data and the future path of the vehicle; and
switching the connection from the 5G site to the 4G site before the vehicle reaches the determined location.
| 10. A system comprising:
one or more processors;
memory coupled to the one or more processors, wherein the memory includes instructions executable by the one or more processors to:
obtain, at a vehicle, signal quality data, from a wireless network, wherein a first wireless network site and a second wireless network site provide at least a portion of a signal associated with the signal quality data;
obtain, at the vehicle, a velocity of the vehicle relative to at least the first wireless network site;
create a dynamic signal quality threshold by increasing the dynamic signal quality threshold when the vehicle is moving away from the first wireless network site, and decreasing the dynamic signal quality threshold when the vehicle is moving toward the first wireless network site; and
switch a wireless network connection between the vehicle, the first wireless network site or the second wireless network site based on the dynamic signal quality threshold.
| 11. The system of claim 10, the vehicle comprising a land vehicle, an aerial vehicle or a water vehicle.
| 12. The system of claim 10, the vehicle comprising an unmanned vehicle.
| 13. The system of claim 10, the wireless network comprising a Wi-Fi network, the first wireless network site and the second wireless network site comprising a Wi-Fi access point.
| 14. The system of claim 10, the wireless network comprising a cellular network, the first wireless network site comprising a fifth-generation wireless technology (5G) site and the second wireless network site comprising a fourth-generation wireless technology (4G) site.
| 15. The system of claim 10, the instructions further comprising the instructions to:
obtaining map data for the wireless network, wherein the map data indicates variations in the signal quality data based on geographic location and a future path of the vehicle;
determining a location where the signal quality data associated with the first wireless network site causes an interruption in a connection between the vehicle and the first wireless network site based on the map data and the future path of the vehicle; and
switching the connection from the first wireless network site to the second wireless network site before the vehicle reaches the determined location.
| 16. The system of claim 10, wherein the instructions to create the dynamic signal quality threshold further comprise instructions to increase the dynamic signal quality threshold when the vehicle is in communication with the second wireless network site and decrease the dynamic signal quality threshold when the vehicle is in communication with the first wireless network site.
| 17. The system of claim 10, wherein the instructions to create the dynamic signal quality threshold further comprise instructions to operate an autonomous vehicle configured to wirelessly communicate with the first and second wireless network site and to measure a signal strength associated with the first and second wireless network site.
| 18. The system of claim 10, wherein the instructions further comprise instructions to:
obtain, at the vehicle, the signal quality data from the wireless network more frequently when a speed of the vehicle is above a speed threshold;
compare the obtained signal quality data to the dynamic signal quality threshold more frequently when the speed of the vehicle is above the speed threshold; and
switch the wireless network connection of the vehicle between the first wireless network site and the second wireless network site based on the comparison.
| 19. The system of claim 10, wherein the signal quality data comprises a signal to interference plus noise ratio (SINR), a reference signal received power (RSRP), a bit error rate, or a packet error rate.
| 20. The system of claim 10, wherein the instructions to switch the wireless network connection of the vehicle further comprise instructions to:
switch the wireless network connection of the vehicle between the first wireless network site and the second wireless network site when the signal quality data matches the dynamic signal quality threshold. | The switching method involves obtaining (600) signal quality data at a vehicle from a fifth-generation wireless technology (5G) site and a fourth-generation wireless technology (4G) site of a cellular network, where the 5G and 4G sites provide a portion of a signal associated with the signal quality data. A velocity of the vehicle relative to the 5G site is obtained (610), and a dynamic signal quality threshold is created (620) by increasing the dynamic signal quality threshold with an increasing velocity of the vehicle. The dynamic signal quality threshold is increased when the vehicle is moving away from the 5G site, and the dynamic signal quality threshold is decreased when the vehicle is moving toward the 5G site. A cellular network connection of the vehicle is switched (630) between the 5G site and the 4G site based on the dynamic signal quality threshold. INDEPENDENT CLAIMS are included for: (a) a non-transient computer-readable medium for switching wireless network sites based on vehicle velocity; (b) a system for switching wireless network sites based on vehicle velocity. Method for switching wireless network sites based on vehicle velocity. The processor switches the cellular connection of the vehicle between the 5G site and the 4G site when the cellular network signal quality data matches or is less than the dynamic signal quality threshold. The processor compares the obtained signal quality data to the dynamic signal quality threshold more frequently when the speed of the vehicle is above the speed threshold. The drawing shows a flowchart of a switching method. 600 Obtaining signal quality data at a vehicle from a fifth-generation wireless technology site and a fourth-generation wireless technology site of a cellular network, 610 Obtaining a velocity of the vehicle relative to the fifth-generation site, 620 Creating a dynamic signal quality threshold, 630 Switching a cellular network connection of the vehicle between the fifth-generation site and the fourth-generation site based on the dynamic signal quality threshold
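A minimal sketch of the dynamic-threshold decision described in the record above, assuming an RSRP-style signal quality metric, an illustrative speed-to-offset lookup table, and a fixed direction adjustment. The numeric values, constant names, and function names are assumptions for demonstration, not values taken from the claims.

```python
# Illustrative sketch: raise the switching threshold at higher speeds and when
# heading away from the 5G site, lower it when heading toward the 5G site.
# All numeric values are placeholders.

SPEED_OFFSET_DB = [(0, 0.0), (30, 1.0), (60, 2.0), (90, 3.0)]  # (km/h, dB offset)
BASE_THRESHOLD_DBM = -110.0   # assumed baseline RSRP threshold
DIRECTION_ADJUST_DB = 2.0     # assumed adjustment for heading away from / toward the site

def speed_offset(speed_kmh: float) -> float:
    """Pick the largest table offset whose speed bin the vehicle has reached."""
    offset = 0.0
    for bin_speed, bin_offset in SPEED_OFFSET_DB:
        if speed_kmh >= bin_speed:
            offset = bin_offset
    return offset

def dynamic_threshold(speed_kmh: float, moving_away_from_5g: bool) -> float:
    threshold = BASE_THRESHOLD_DBM + speed_offset(speed_kmh)
    threshold += DIRECTION_ADJUST_DB if moving_away_from_5g else -DIRECTION_ADJUST_DB
    return threshold

def should_switch_to_4g(rsrp_5g_dbm: float, speed_kmh: float,
                        moving_away_from_5g: bool) -> bool:
    """Switch when the measured 5G signal quality falls below the dynamic threshold."""
    return rsrp_5g_dbm < dynamic_threshold(speed_kmh, moving_away_from_5g)

if __name__ == "__main__":
    # A fast vehicle heading away from the 5G site switches earlier than a slow one.
    print(should_switch_to_4g(rsrp_5g_dbm=-109.0, speed_kmh=80, moving_away_from_5g=True))
```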
Please summarize the input | Hybrid mesh of licensed and unlicensed wireless frequency bands. This disclosure describes techniques for creating a hybrid mesh of unlicensed wireless frequency bands between two or more vehicles communicating using an unlicensed wireless frequency band, and a massive MIMO base station communicating with the two or more vehicles using a licensed wireless frequency band. The hybrid mesh can be used to upload and download data from a vehicle in motion. The hybrid mesh can be formed via V2V connections between the vehicle and nearby vehicles. In other words, if a vehicle moves into a region outside the operating boundary of a 5G-NR massive MIMO base-station node, the vehicle can interact with other vehicles to generate a data pipeline using the unlicensed wireless frequency band from the vehicle to the nearby vehicle, and using the licensed wireless frequency band from the nearby vehicle to the nearest massive MIMO base station. The invention claimed is:
| 1. At least one non-transitory, computer-readable medium carrying instructions, which when executed by a data processing in a vehicle, perform a method to facilitate transmission or reception of a data file, the method comprising:
receiving, at the vehicle, a request to transmit the data file;
analyze the data file to determine a bandwidth for transmission;
determine that an available bandwidth associated with a licensed wireless frequency band is less than the determined bandwidth; and
initiate a hybrid mesh of an unlicensed wireless frequency band and the licensed wireless frequency band to transmit the data file,
wherein the hybrid mesh corresponds to at least one Vehicle-to-Vehicle (V2V) communication connection between the vehicle and at least one other vehicle, and with a base station associated with the licensed wireless frequency band; and
transmit the data file via the hybrid mesh.
| 2. The non-transitory, computer-readable medium of claim 1, further comprising instructions to:
obtain at least two of: a speed associated with the vehicle, a planned path associated with the vehicle, or a map of base stations capable of providing the determined bandwidth for transmission;
determine a time required to transmit the data file;
determine whether the vehicle has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the speed, the planned path, or the map of base stations; and
upon determining that the vehicle does not have access to the base station during the time required to transmit the data file, determine that the available bandwidth associated with the licensed wireless frequency band is less than the determined bandwidth.
| 3. The non-transitory, computer-readable medium of claim 1, wherein initiating the hybrid mesh further comprises instructions to:
obtain at least two of: multiple speeds associated with multiple vehicles, multiple planned paths associated with the multiple vehicles, or a map of base stations capable of offering the determined bandwidth for transmission;
determine a time required to transmit the data file;
determine whether one or more vehicles among the multiple vehicles has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the multiple speeds, the multiple planned paths or the map of base stations; and
upon determining the one or more vehicles among the multiple vehicles has access to the base station, initiate the hybrid mesh between the vehicle and the one or more vehicles.
| 4. The non-transitory, computer-readable medium of claim 1, further comprising instructions to:
obtain a planned path associated with the vehicle;
determine that a first group of vehicles along the planned path does not provide the determined bandwidth for transmission;
determine an alternate path associated with the vehicle that provides the determined bandwidth for transmission; and
suggest the alternate path to the vehicle, and
wherein determining the alternate path further comprises instructions to:
obtain a map of base stations along the alternate path, an available bandwidth associated with a base station along the alternate path, a second group of vehicles along the alternate path, and a bandwidth required to transmit the data file;
determine at least a portion of the second group of vehicles along the alternate path enabling transmission of the data file based on the available bandwidth, the bandwidth required and a location of the base station; and
suggest the alternate path to the vehicle.
| 5. The non-transitory, computer-readable medium of claim 1, further comprising instructions to:
obtain a planned path associated with the vehicle;
determine that a first group of vehicles along the planned path does not provide the determined bandwidth for transmission;
determine an alternate path associated with the vehicle that provides the determined bandwidth for transmission; and
suggest the alternate path to the vehicle.
| 6. The non-transitory, computer-readable medium of claim 1, further comprising instructions to:
obtain at least two of: a map of base stations, an available bandwidth associated with a base station in the map of base stations, or a bandwidth required to transmit the data file;
determine a speed of the vehicle for enabling transmission of the data file based on at least two of: the available bandwidth associated with the base station in the map of base stations, the bandwidth required, or a location of the base station; and
suggest the speed to the vehicle.
| 7. The non-transitory, computer-readable medium of claim 1, further comprising instructions to:
obtain at least two of: a map of base stations proximate to the vehicle, a planned path associated with the vehicle, or multiple planned or expected paths associated with multiple nearby vehicles;
determine that the hybrid mesh can be formed at a later point in time based on at least two of: the map of base stations, the planned path associated with the vehicle and the multiple planned or expected paths associated with the multiple nearby vehicles; and
delay transmission of the data file until the later point in time.
| 8. The non-transitory, computer-readable medium of claim 1, the vehicle comprising a low band transceiver to communicate with the at least one other vehicle, and a high band transceiver to communicate with the base station, wherein the vehicle is an autonomous vehicle, and wherein the instructions further comprise instructions to navigate the autonomous vehicle along an alternate path, or at a new speed from a current speed, to provide the determined bandwidth for transmission.
| 9. The non-transitory, computer-readable medium of claim 1, the unlicensed wireless frequency band comprising an IEEE 802.11s or 802.11ay standard.
| 10. At least one non-transient, computer-readable medium, carrying instructions that, when executed by at least one data processor, performs a method to facilitate transmission or reception of a data file for a vehicle, the method comprising:
creating a hybrid communication mesh to transmit from the vehicle, or receive at the vehicle, the data file,
wherein the hybrid communication mesh includes communicating with at least one other vehicle using an unlicensed wireless frequency band, and communicating with a massive MIMO base station communicating using a licensed wireless frequency band,
wherein creating the hybrid communication mesh includes determining that the vehicle is not within range of the massive MIMO base station and then creating the hybrid communication mesh of unlicensed wireless frequency bands between the vehicle and a nearest massive MIMO base station using a vehicle-to-vehicle (V2V) connection with the at least one other vehicle, and
wherein creating the hybrid communication mesh includes determining whether a communication parameter for the hybrid communication mesh satisfies a threshold; and,
transmitting the data file via the created hybrid communication mesh.
| 11. The non-transient, computer-readable medium of claim 10, wherein the communication parameter is an available bandwidth associated with the licensed wireless frequency band, and wherein the method further comprises:
obtaining at least two of: a speed associated with the vehicle, a planned path associated with the vehicle, or a map of base stations capable of providing a determined bandwidth for transmission;
determining a time required to transmit the data file;
determining whether the vehicle has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the speed, the planned path, or the map of base stations; and
upon determining that the vehicle does not have access to the base station during the time required to transmit the data file, determining that the available bandwidth associated with the licensed wireless frequency band is less than the determined bandwidth.
| 12. The non-transient, computer-readable medium of claim 10, wherein initiating the hybrid communication mesh further comprises:
obtaining at least two of: multiple speeds associated with multiple vehicles, multiple planned paths associated with the multiple vehicles or a map of base stations capable of offering a determined bandwidth for transmission;
determining a time required to transmit the data file;
determining whether one or more vehicles among the multiple vehicles has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the multiple speeds, the multiple planned paths or the map of base stations; and
upon determining the one or more vehicles among the multiple vehicles has access to the base station, initiating the hybrid communication mesh between the vehicle and the one or more vehicles.
| 13. At least one non-transient, computer-readable medium, carrying instructions that, when executed by at least one data processor, performs a method to facilitate transmission or reception of a data file for a vehicle, the method comprising:
receiving a request to transmit the data file to or from the vehicle;
obtaining at least two of: a speed associated with the vehicle, a planned path associated with the vehicle, or a map of base stations capable of providing a selected bandwidth for transmission; and,
determining that a hybrid mesh can be created at a later point-in-time, and
deferring the transmission of the data file until the later point-in-time; or,
determining a different route as being more amenable to generating the hybrid mesh versus a current route, and
recommending the vehicle detour along the different route.
| 14. The non-transient, computer-readable medium of claim 13, wherein determining the different route as being more amenable to generating the hybrid mesh includes determining that more vehicles are on the different route that are capable of participating in V2V communications than on the current route.
| 15. The non-transient, computer-readable medium of claim 13, further comprising: determining a time required to transmit the data file, and determining whether the vehicle has access to a base station in the map of base stations during the time required to transmit the data file based on the speed, the planned path, and the map of base stations.
| 16. The non-transient, computer-readable medium of claim 13, further comprising determining that an available bandwidth associated with a licensed wireless frequency band is less than the selected bandwidth.
| 17. The non-transient, computer-readable medium of claim 13, further comprising:
obtaining the planned path associated with the vehicle;
determining that a first group of vehicles along the planned path does not provide the selected bandwidth for transmission; and
determining the different route associated with the vehicle that provides the selected bandwidth for transmission.
| 18. The non-transient, computer-readable medium of claim 13, further comprising:
obtaining a map of base stations along the different route, an available bandwidth associated with a base station along the different route, a second group of vehicles along the different route, and a selected bandwidth to transmit the data file;
determining at least a portion of the second group of vehicles along the different route enabling transmission of the data file based on the available bandwidth associated with the base station along the different route, the selected bandwidth required and a location of the base station.
| 19. The non-transient, computer-readable medium of claim 13, further comprising:
obtaining a map of base stations proximate to the vehicle, the planned path associated with the vehicle and multiple planned paths associated with multiple nearby vehicles;
determining that the hybrid mesh can be formed at a later point in time based on the map of base stations, the planned path associated with the vehicle and the multiple planned paths associated with the multiple nearby vehicles; and
delaying transmission of the data file until the later point in time.
| 20. The non-transient, computer-readable medium of claim 13, further comprising:
measuring signal strength to the base station using a high band transceiver associated with the vehicle by periodically communicating with the base station; and
communicating the measurement to a nearby vehicle. | The system has a memory which is coupled to a processor. The processor receives a request to transmit a data package at vehicles (110, 120, 130, 140) and analyzes the data package to determine a required bandwidth for transmission. The processor determines that an available bandwidth associated with a licensed wireless frequency band is less than the required bandwidth. The licensed wireless frequency band comprises a millimeter wavelength band. The processor initiates hybrid meshes (135, 145) of an unlicensed wireless frequency band and the licensed wireless frequency band to transmit the data package. The hybrid mesh corresponds to multiple vehicle-to-vehicle (V2V) communication connections (115) between the vehicles and with a base station (100) associated with the licensed wireless frequency band, and the data package is transmitted through the hybrid mesh. INDEPENDENT CLAIMS are included for the following: a non-transient computer-readable medium storing a program for creating a hybrid mesh of unlicensed wireless frequency bands; and a method for creating a hybrid mesh of unlicensed wireless frequency bands. System for creating a hybrid mesh of unlicensed wireless frequency bands between vehicles, e.g. cars, communicating using an unlicensed wireless frequency band. The method enables improving subscriber quality of experience (QoE) and obtaining high-bandwidth communication between a device and a base station despite conditions that impede the effectiveness of a 5G-NR transmission for subscriber devices in motion. The method allows the vehicle to interact with other vehicles to generate a data pipeline using the unlicensed wireless frequency band from a vehicle to a nearby vehicle, and using the licensed wireless band from the nearby vehicle to the nearest massive MIMO base station if the vehicle moves into a region outside the operating boundary of the 5G NR massive-multiple-input multiple-output (MIMO) base-station node. The drawing shows a schematic view of a hybrid mesh corresponding to communication between vehicles and a base station. 100 Base station, 110, 120, 130, 140 Vehicles, 115 Connection, 135, 145 Hybrid meshes, 160 Building
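A minimal sketch of the bandwidth decision that precedes forming the hybrid mesh described in the record above, assuming the required bandwidth is derived from the file size and a transfer deadline and that the set of V2V-capable neighbor vehicles with licensed-band backhaul is already known. The data-class fields, helper names, and the greedy neighbor selection are illustrative assumptions.

```python
# Illustrative sketch: decide whether the licensed band alone suffices, and if not,
# pick V2V neighbors whose combined licensed-band backhaul meets the requirement.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Neighbor:
    vehicle_id: str
    licensed_uplink_mbps: float   # bandwidth this neighbor can relay to a base station
    v2v_link_mbps: float          # unlicensed V2V link capacity to our vehicle

def required_mbps(file_size_mb: float, deadline_s: float) -> float:
    return (file_size_mb * 8.0) / deadline_s

def plan_transfer(file_size_mb: float, deadline_s: float,
                  own_licensed_mbps: float,
                  neighbors: List[Neighbor]) -> Optional[List[str]]:
    """Return [] if the licensed band suffices, a list of neighbor IDs forming a
    hybrid mesh otherwise, or None if no plan meets the need (defer or reroute)."""
    need = required_mbps(file_size_mb, deadline_s)
    if own_licensed_mbps >= need:
        return []
    mesh, total = [], own_licensed_mbps
    for n in sorted(neighbors, key=lambda n: n.licensed_uplink_mbps, reverse=True):
        usable = min(n.licensed_uplink_mbps, n.v2v_link_mbps)
        mesh.append(n.vehicle_id)
        total += usable
        if total >= need:
            return mesh
    return None  # defer to a later point in time or suggest an alternate path

if __name__ == "__main__":
    nbrs = [Neighbor("carA", 40, 30), Neighbor("carB", 25, 50)]
    print(plan_transfer(file_size_mb=500, deadline_s=60, own_licensed_mbps=20, neighbors=nbrs))
```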
Please summarize the input | UAV supported vehicle-to-vehicle communication. The use of unmanned aerial vehicle (UAV) communication cells in conjunction with MEC nodes may provide low-latency processing of vehicle movement data to generate vehicle guidance instructions for vehicles. Vehicle movement data of vehicles are received at a base station of a wireless carrier network from a UAV communication cell that is attached to the base station. The base station sends the vehicle movement data to a mobile edge computing (MEC) node that directly communicates with the base station so that the MEC node generates vehicle guidance instructions. The vehicle guidance instructions are then received by the base station from the MEC node. In turn, the base station sends the vehicle guidance instructions to the UAV communication cell for broadcasting to vehicles. What is claimed is:
| 1. A computer-implemented method, comprising:
receiving, at a mobile edge computing (MEC) node that is paired with a corresponding base station of a wireless carrier network, a report of a malfunction of a transceiver of the corresponding base station or a communication traffic overload of the corresponding base station from the corresponding base station, wherein the MEC node is deployed in addition to a server of a centralized data center;
deploying, by the MEC node that is paired with the corresponding base station and deployed in addition to the server of the centralized data center, an unmanned aerial vehicle (UAV) communication cell to communicate with a plurality of vehicles in response to the malfunction of a transceiver of the corresponding base station or the communication traffic overload of the corresponding base station;
receiving vehicle movement data of one or more vehicles at the MEC node that is paired with the corresponding base station from the UAV communication cell that is deployed by the MEC node to communicate with the one or more vehicles;
processing the vehicle movement data at the MEC node that is paired with the corresponding base station to generate vehicle guidance instructions for at least one vehicle; and
sending the vehicle guidance instructions from the MEC node that is paired with the corresponding base station to the UAV communication cell that is deployed by the MEC node for broadcasting to at least one vehicle.
| 2. The computer-implemented method of claim 1, wherein the MEC node is directed by a deployment controller of the wireless carrier network to deploy the UAV communication cell.
| 3. The computer-implemented method of claim 1, wherein the UAV communication cell is attached to the MEC node via a wired connection that provides power to the UAV communication cell and a communication link between the UAV communication cell and the MEC node.
| 4. The computer-implemented method of claim 1, wherein the vehicle movement data is received by the UAV communication cell from a plurality of vehicles or received by the UAV communication cell from an additional UAV communication cell via cellular vehicle-to-everything (CV2X) communication.
| 5. The computer-implemented method of claim 4, wherein the vehicle movement data is obtained by the additional UAV communication cell from the one or more vehicles via cellular vehicle-to-everything (CV2X) communication.
| 6. The computer-implemented method of claim 1, wherein the broadcasting is performed by an additional UAV communication cell that receives the vehicle guidance instructions from the UAV communication cell via cellular vehicle-to-everything (CV2X) communication.
| 7. The computer-implemented method of claim 1, wherein the corresponding base station communicates with the MEC node via a dedicated communication link that is independent of a backhaul that connects the corresponding base station to a core network of the wireless carrier network.
| 8. The computer-implemented method of claim 1, wherein the vehicle movement data for a vehicle includes a current vehicle location of the vehicle, a direction of travel of the vehicle, a speed of travel for the vehicle, an acceleration rate of the vehicle, and a deceleration rate of the vehicle.
| 9. The computer-implemented method of claim 1, wherein the vehicle movement data for a vehicle includes one or more of a current vehicle location of the vehicle, a direction of travel of the vehicle, a speed of travel for the vehicle, an acceleration rate of the vehicle, or a deceleration rate of the vehicle.
| 10. The computer-implemented method of claim 1, wherein the vehicle guidance instructions are used by a vehicle to perform autonomous driving of the vehicle, perform an automatic driving maneuver, or provide a driving condition to a driver of the vehicle.
| 11. A mobile edge computing (MEC) node, comprising:
one or more processors; and
memory having instructions stored therein, the instructions, when executed by the one or more processors, cause the one or more processors to perform acts comprising:
receiving, from a neighboring MEC node that is paired with a base station of a wireless carrier network, a command for the MEC node to deploy an unmanned aerial vehicle (UAV) communication cell to communicate with a plurality of vehicles, the MEC node being unpaired with any base station of the wireless carrier network;
deploying, by the MEC node, the unmanned aerial vehicle (UAV) communication cell to communicate with a plurality of vehicles;
receiving vehicle movement data of one or more vehicles at the MEC node from the UAV communication cell;
processing the vehicle movement data at the MEC node to generate vehicle guidance instructions for at least one vehicle; and
sending the vehicle guidance instructions to the UAV communication cell for broadcasting to the at least one vehicle.
| 12. The MEC node of claim 11, wherein each of the one or more vehicles includes a vehicle control module that is authenticated by a core network of the wireless carrier network to communicate with the UAV communication cell, the vehicle control module to provide corresponding vehicle movement data to the UAV communication cell and receive vehicle guidance instructions from the UAV communication cell.
| 13. The MEC node of claim 11, wherein the UAV communication cell is deployed by the MEC node based on a deployment schedule to receive the vehicle movement data of the one or more vehicles.
| 14. The MEC node of claim 11, wherein the command from the neighboring MEC node is relayed by the base station paired to the neighboring MEC node.
| 15. The MEC node of claim 11, wherein the UAV communication cell is deployed in response to a number of user devices that are connected to the base station exceeding a predetermined number threshold, vehicle traffic data provided by a traffic monitoring service indicating that a number of vehicles in a geographical area serviced by the base station exceeding a predetermined number threshold, the base station reporting a service outage, or a predetermined number of user devices serviced by the base station reporting a quality of service (QoS) value has dropped below or exceeded a threshold.
| 16. The MEC node of claim 11, wherein the vehicle movement data is obtained by an additional UAV communication cell from the one or more vehicles and relayed to the UAV communication cell, and wherein the sending includes sending the vehicle guidance instructions from the UAV communication cell to another communication cell that broadcast the vehicle guidance instructions to the at least one vehicle.
| 17. The MEC node of claim 11, wherein the receiving includes receiving the vehicle movement data from the UAV communication cell when the UAV communication cell or a distributed computing network of multiple UAV communication cells that include the UAV communication cell is unable to process the vehicle movement data into the vehicle guidance instructions in a predetermined amount of time.
| 18. One or more non-transitory computer-readable media of an unmanned aerial vehicle (UAV) communication cell storing computer-executable instructions that upon execution cause one or more processors to perform acts comprising:
receiving vehicle movement data of one or more ground vehicles at the UAV communication cell;
in response to determining that the UAV communication cell is unable to process the vehicle movement data into vehicle guidance instructions for at least one ground vehicle in a predetermined amount of time, transmitting at least a portion of the vehicle movement data to an additional UAV communication cell; and
in response to determining that the UAV communication cell is able to process the vehicle movement data into the vehicle guidance instructions in the predetermined amount of time, processing the vehicle movement data at the UAV communication cell into the vehicle guidance instructions for distribution to the at least one ground vehicle, where the vehicle guidance instructions include one or more automatic lane change directives for the at least one ground vehicle, one or more braking directives for the at least one ground vehicle, or one or more vehicle turning commands for the at least one ground vehicle.
| 19. The one or more non-transitory computer-readable media of claim 18, wherein the UAV communication cell and the additional UAV communication cell are part of a distributed computing network, and wherein the transmitting includes transmitting the vehicle movement data to a plurality of UAV communication cells of the distributed computing network for processing into the vehicle guidance instructions.
| 20. The one or more non-transitory computer-readable media of claim 18, wherein the additional UAV communication cell is connected to a mobile edge computing (MEC) node that processes the at least a portion of the vehicle movement data received by the additional UAV communication cell into the vehicle guidance instructions. | The method (400) involves receiving (402) vehicle movement data of several vehicles at a base station of a wireless carrier network from a UAV communication cell that is attached to the base station. The vehicle movement data from the base station is sent (404) to a MEC node that directly communicates with the base station for the MEC node to generate vehicle guidance instructions for a vehicle. The vehicle guidance instructions are received (406) from the MEC node that is coupled to the base station. The vehicle guidance instructions are sent (408) to the UAV communication cell for broadcasting to a vehicle. INDEPENDENT CLAIMS are included for the following: a mobile edge computing (MEC) node; and a non-transitory computer-readable medium storing a program for using a UAV communication cell to support vehicle movement data processing by a MEC node. Computer-based method for using an unmanned aerial vehicle (UAV) communication cell to support vehicle movement data processing by a mobile edge computing (MEC) node (claimed). The generation of the vehicle guidance instructions by a MEC node that is locally paired with a base station reduces communication latency. These capabilities enable the UAV communication cells to act as a mesh network or a distributed computing network to provide redundant communication and/or data processing capabilities. The drawing shows a flowchart illustrating the method for a base station to use a UAV communication cell to support vehicle movement data processing by a MEC node. 400 Method for using a UAV communication cell to support vehicle movement data processing by a MEC node for a base station, 402 Step for receiving vehicle movement data of several vehicles at a base station of a wireless carrier network from a UAV communication cell that is attached to the base station, 404 Step for sending the vehicle movement data from the base station to a MEC node that directly communicates with the base station for the MEC node to generate vehicle guidance instructions for a vehicle, 406 Step for receiving the vehicle guidance instructions from the MEC node that is coupled to the base station, 408 Step for sending the vehicle guidance instructions to the UAV communication cell for broadcasting to a vehicle
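A minimal sketch of how a MEC node paired with a base station might turn relayed vehicle movement data into a guidance instruction, as summarized in the record above, assuming a simple time-headway rule for issuing a braking directive. The message fields, the 1-D position simplification, and the 2-second threshold are illustrative assumptions, not part of the claimed system.

```python
# Illustrative sketch: a base-station-paired MEC node receives movement data relayed
# by a UAV communication cell and returns a guidance instruction for broadcast.

from dataclasses import dataclass

@dataclass
class MovementData:
    vehicle_id: str
    position_m: float     # 1-D position along the roadway (simplification)
    speed_mps: float
    heading_deg: float

def guidance_for(follower: MovementData, leader: MovementData,
                 min_headway_s: float = 2.0) -> dict:
    """Issue a braking directive when the time headway to the leader is too small."""
    gap_m = leader.position_m - follower.position_m
    headway_s = gap_m / max(follower.speed_mps, 0.1)
    if 0 < headway_s < min_headway_s:
        return {"vehicle_id": follower.vehicle_id, "directive": "brake",
                "target_speed_mps": max(leader.speed_mps - 1.0, 0.0)}
    return {"vehicle_id": follower.vehicle_id, "directive": "maintain"}

if __name__ == "__main__":
    lead = MovementData("veh-1", position_m=120.0, speed_mps=20.0, heading_deg=90.0)
    tail = MovementData("veh-2", position_m=90.0, speed_mps=25.0, heading_deg=90.0)
    # In the described system this result would be sent back to the UAV cell for broadcast.
    print(guidance_for(tail, lead))
```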
Please summarize the input | V2X Supported Unmanned Aircraft. Techniques for performing traffic and road operations using communication between unmanned aerial vehicles (UAVs) and vehicles with vehicle-to-everything (V2X) support are described below. The UAVs can be deployed at different locations and can be functionally connected to a cell site, a Mobile Edge Computing (MEC) server and/or a V2X sensor. Upon detecting an incoming vehicle traveling from one location toward one or more autonomous vehicles (AVs), the UAVs can communicate directly with the one or more AVs to warn the AVs of the incoming vehicle. If the UAVs cannot communicate with the AVs, the UAVs can forward messages to other UAVs that can communicate with the AVs. The UAVs can also relay messages to a cell site and/or a V2X sensor, which in turn send the messages to one or more AVs. Cell sites can likewise relay the messages to the AVs within their respective coverage areas.|1. One or more non-transitory computer-readable media of an unmanned aerial vehicle (UAV) that store computer-executable instructions that, when executed, cause one or more processors to perform actions that include:
Detecting a presence of a first vehicle at a first position, the first vehicle moving toward a second position;
Generating a message for transmission to a second vehicle having a vehicle-to-everything (V2X) capability in a vicinity of the second location, the message indicating the presence of the first vehicle; and
if the second vehicle is not in a communication area of the UAV, sending the message to a V2X sensor at the second position, the V2X sensor being configured to forward the message to the second vehicle via V2X communication.
| 2. The one or more non-transitory computer-readable media of claim 1, wherein the message is transmitted over a cellular air interface.
| 3. The one or more non-transitory computer-readable media of claim 1, wherein the first vehicle is a train and the second vehicle is an autonomous vehicle (AV).
| 4. The one or more non-transitory computer-readable media of claim 1, wherein the UAV is operatively connected to a cell site and the actions further include:
Transmitting the message to the cell site after determining that the second vehicle is not within a second communication area of the V2X sensor.
| 5. The one or more non-transitory computer-readable media of claim 4, wherein the UAV is connected to the cell site via a wired transport.
| 6. The one or more non-transitory computer-readable media of claim 1, the actions further comprising:
Sending the message to a second UAV at the second location after determining that the second vehicle is not within the communication range.
| 7. The one or more non-transitory computer-readable media of claim 1, wherein the second vehicle is further configured to transmit the message to a third vehicle via vehicle-to-vehicle (V2V) communication.
| 8. A computer-implemented method comprising:
Receiving a message from a UAV indicating the presence of a first vehicle at a first location, the first vehicle traveling to a second location; and
Transmitting the message indicating the presence of the first vehicle via a cell station to a second vehicle near the second location, the second vehicle having a vehicle-to-everything (V2X) capability.
| 9. The computer-implemented method of claim 8, further comprising:
Determining a period of time in which the first vehicle and the second vehicle arrive at an intersection at the second position based at least in part on a first speed of the first vehicle and a first distance between the first vehicle and the second position with respect to a second speed of the second vehicle and a second distance between the second vehicle and the intersection; and
Generating a speed indication for the second vehicle.
| 10. The computer-implemented method of claim 8, further comprising:
Transmitting the message to a second cell station when it is determined that the second vehicle is not in a communication area of the cell station, the second cell station being configured to transmit the message to the second vehicle.
| 11. The computer-implemented method of claim 8, further comprising:
Determining that the message is to be transmitted to a third vehicle that is not in a communication area of the cell station, the second vehicle being configured to send the message to the third vehicle via vehicle-to-vehicle (V2V) communication.
| 12. The computer-implemented method of claim 8, further comprising:
Determining that the second vehicle is not in a communication range of the cell station; and
Transmitting the message to a second UAV, the second UAV being configured to communicate with the second vehicle via a cellular air interface.
| 13. The computer-implemented method of claim 8, further comprising:
Transmitting the message to a vehicle-to-everything (V2X) sensor at the second location if the second vehicle is not in a communication area of the cell station.
| 14. A system comprising:
one or more non-transitory storage media of an unmanned aerial vehicle (UAV) configured to provide stored code segments, the one or more non-transitory storage media coupled to one or more processors each configured to execute the code segments and cause the one or more processors to:
Detecting a presence of a first vehicle at a first location, the first vehicle traveling to a second location;
Generating a message for transmission to a second vehicle near the second location, the message indicating the presence of the first vehicle; and
if the second vehicle is not within a communication area of the UAV, sending the message to at least one device connected to a server at the second location after determining that the second vehicle is in a second communication area of the at least one device.
| 15. The system of claim 14, wherein the at least one device is a vehicle-to-everything (V2X) sensor, a cell site, or a second UAV. | Computer-readable media includes executable instructions to detect (302) a presence of a vehicle at a first location, the vehicle capable of traveling towards a second location. A message is generated to transmit (304) to a second vehicle having a vehicle-to-everything (V2X) capability in a vicinity of the second location. The message indicates that the first vehicle is traveling towards the second location. When the second vehicle is not within a communication range of the UAV, the message is broadcast (306) to a V2X sensor at the second location. The V2X sensor is configured (310) to relay the message to the second vehicle using a V2X communication protocol. INDEPENDENT CLAIMS are also included for the following: a computer-implemented method; and a system. Computer-readable media in a system of an unmanned aerial vehicle for stable communications to monitor road conditions and detect oncoming traffic. The message includes information that would enable an AV to make driving decisions. The control station can instruct each Unmanned Aerial Vehicle (UAV) to deploy to a target location to ensure coverage while minimizing overlapping. Although cellular-based V2X can provide a higher percentage of successful data packet delivery and a longer communication range than WLAN-based V2X, a vehicle must still be within a communication range of a target entity to enable successful passing of information or data packet delivery in V2X communication. The Mobile Edge Computing (MEC) server can provide computing resources, storage capacity, connectivity, and access to RAN information. The drawing shows a flow-chart of a method for utilizing UAVs and V2X sensors to conduct traffic and road operations. 302 Detecting a first location of a vehicle, 304 Transmitting a message to the second location, 306 Broadcasting the message to second-location sensors, 310 Communicating the message to the second vehicle
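A minimal sketch of the time-to-intersection check and relay choice described in the record above (a train approaching a crossing that an AV is also heading toward), assuming straight-line kinematics and a fixed conflict window. The delivery fallback order (direct UAV, V2X sensor, cell site) follows the record; the numeric values and function names are assumptions.

```python
# Illustrative sketch: estimate when the train and the AV reach the crossing,
# generate a slow-down advisory if their arrival times overlap, and choose a
# delivery path based on which node can currently reach the AV.

def arrival_time_s(distance_m: float, speed_mps: float) -> float:
    return distance_m / max(speed_mps, 0.1)

def build_advisory(train_dist_m, train_speed, av_dist_m, av_speed,
                   conflict_window_s: float = 10.0) -> dict:
    t_train = arrival_time_s(train_dist_m, train_speed)
    t_av = arrival_time_s(av_dist_m, av_speed)
    if abs(t_train - t_av) < conflict_window_s:
        return {"type": "speed_advisory", "action": "slow_down",
                "train_eta_s": round(t_train, 1), "av_eta_s": round(t_av, 1)}
    return {"type": "notice", "train_eta_s": round(t_train, 1)}

def deliver(message: dict, av_in_uav_range: bool, av_in_sensor_range: bool) -> str:
    """Prefer direct UAV delivery, then the V2X sensor, then the cell site."""
    if av_in_uav_range:
        return f"UAV -> AV: {message}"
    if av_in_sensor_range:
        return f"UAV -> V2X sensor -> AV: {message}"
    return f"UAV -> cell site -> AV: {message}"

if __name__ == "__main__":
    msg = build_advisory(train_dist_m=800, train_speed=20, av_dist_m=450, av_speed=12)
    print(deliver(msg, av_in_uav_range=False, av_in_sensor_range=True))
```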
Please summarize the input | NETWORK TESTING DETERMINATION. A fifth generation (5G) network can provide testing capabilities by employing a test server to generate requests for testing at remote locations of the 5G network. The test server can be used to initiate tests, determine test conditions for conducting the tests, direct test locations, and receive test data from test tools remote from the server. The test server can initiate, establish, maintain, format, or otherwise determine tests that are usable to improve operation of the 5G network. What is claimed is:
| 1. A method comprising:
configuring, by a server of a network, a message indicating a test location and a test condition associated with the network;
sending, by the server, the message to a test module included in a vehicle, the test module configured to:
receive network data associated with one or more user equipment (UEs), a base station (BS), and a navigation component included in the vehicle; and
determine a capability of the network at the test location in accordance with the test condition based at least in part on the network data;
sending, by the server, the test location to the navigation component of the vehicle;
causing, by the server, the vehicle to navigate to the test location;
receiving, by the server and based at least in part on the vehicle navigating to the test location, test data from the test module indicating one or more capabilities of the network at the test location;
associating, by the server, metadata with the test data from the test module;
storing the test data and the metadata in a database; and
determining, by the server and based at least in part on using the metadata to access the test data from the database, network parameters associated with peak throughput for the one or more UEs accessing the network.
| 2. The method of claim 1, wherein the test data is first test data and the test module is a first test module, and further comprising:
receiving, by the server, second test data from a second test module indicating one or more capabilities of the network at a second test location; and
determining, by the server and based at least in part on the first test data and the second test data, the network parameters,
wherein the network parameters comprising one or more of: a location, a signal strength, a beam Precoding Matrix Indicator (PMI) number, a frequency, a power, or a signal-to-noise ratio to transmit data between the one or more UEs and the network.
| 3. The method of claim 1, wherein the message indicates multiple test locations and multiple test conditions associated with testing at least one of: a Time Division Duplex (TDD) system, a Frequency Division Duplex (FDD) system, an Unmanned Aerial Vehicle (UAV) system, a Vehicle to Everything (V2X) system, or a multiple user multi-input multi-output (MU-MIMO) system.
| 4. The method of claim 1, wherein the network is a 5G network and the message is associated with multiple user multi-input multi-output (MU-MIMO) testing in the 5G network.
| 5. The method of claim 1, further comprising:
identifying interference between one or more Unmanned Aerial Vehicles (UAVs) and a base station; and
based at least in part on identifying the interference, at least one of:
mitigating interference between the one or more UAVs and the base station;
mitigating noise rise associated with the one or more UEs and the base station; or
causing the one or more UAVs or UEs to be inoperable.
| 6. The method of claim 1, further comprising:
determining, by the server, a first test and a second test different from the first test;
ranking, by the server, the first test relative to the second test based at least in part on a comparison between a first location associated with the first test and a second location associated with the second test; and
sending an indication of the ranking to the navigation component to cause the vehicle to navigate to the first location and the second location in accordance with the ranking.
| 7. The method of claim 1, wherein:
the test condition indicates to perform at least one of: an upload test, a download test, a combined upload and download test, a latency test, a jitter test, a packet rate test, a voice quality test, a video quality test, a backhaul test, or a handover test; and
the one or more capabilities of the network indicated by the test data comprises one or more of: downlink and uplink throughputs, latency, volume of data, time/location, amount of call attempts, accepts, failures, amount of handovers, mean-opinion-score (MOS), signal to interference plus noise ratio (SINR), modulation and coding scheme (MCS), signal strength, UE transmit power, cell site number, or a frequency.
| 8. A system comprising:
one or more processors; and
memory storing computer-executable instructions that, when executed by the one or more processors, cause the system to perform operations comprising:
configuring, by a server of a network, a message indicating a test location and a test condition associated with the network;
sending, by the server, the message to a test module included in a vehicle, the test module configured to:
receive network data associated with one or more user equipment (UEs), a base station (BS), and a navigation component included in the vehicle; and
determine a capability of the network at the test location in accordance with the test condition based at least in part on the network data;
sending, by the server, the test location to the navigation component of the vehicle;
causing, by the server, the vehicle to navigate to the test location;
receiving, by the server and based at least in part on the vehicle navigating to the test location, test data from the test module indicating one or more capabilities of the network at the test location;
associating, by the server, metadata with the test data from the test module;
storing the test data and the metadata in a database; and
determining, by the server and based at least in part on using the metadata to access the test data from the database, network parameters associated with peak throughput for the one or more UEs accessing the network.
| 9. The system of claim 8, the operations further comprising causing the network to communicate with multiple user equipment (UEs) at a same time based at least in part on the test data.
| 10. The system of claim 8, wherein:
the message is associated with testing separation between two base stations; and
determining the network parameters further comprises determining a horizontal distance or a vertical distance between the two base stations and determining a frequency guard band between the two base stations.
| 11. The system of claim 8, the operations further comprising configuring the message based at least in part on an occurrence of one or more events, the one or more events comprising a drop call rate, a change in noise, or a pre-determined throughput value.
| 12. The system of claim 8, the operations further comprising:
receiving network data from a remote sensor; and
configuring the message based at least in part on the network data from the remote sensor meeting or exceeding a testing threshold.
| 13. The system of claim 8, wherein:
the navigation component included in the vehicle comprises a navigation device or a Vehicle to Everything (V2X) device; and
wherein the metadata indicates an upload time or a test time of the test data.
| 14. The system of claim 8, the operations further comprising:
determining one or more of: a bandwidth, a Time Division Duplex (TDD) ratio configuration, a frequency channel, a transmission power, a beamforming Precoding Matrix Indicator (PMI), or a distance between two test modules to provide peak throughput for the one or more UEs accessing the network.
| 15. The system of claim 8, the operations further comprising identifying, based at least in part on the test data, a Time Division Duplex (TDD) ratio configuration that enables a first operator and a second operator to communicate simultaneously in a TDD system.
| 16. The system of claim 8, the operations further comprising:
receiving an indication to test the network; and
sending, by the server, the message to the test module included in the vehicle based at least in part on the indication to test the network.
| 17. A method comprising:
receiving, by a server of a network, test data from a test module included in a vehicle, the test data indicating one or more capabilities of the network at a test location and in accordance with a test condition, the test module configured to:
receive network data associated with one or more user equipment (UEs), a base station (BS), and a navigation component included in the vehicle; and
determine the one or more capabilities of the network based at least in part on the network data;
determining, by the server, metadata to associate with the test data, the metadata comprising the test location, the test condition, a test time, and first navigation data associated with the navigation component;
receiving, by the server, second navigation data from an additional navigation component included in an additional vehicle;
identifying, by the server, a sequence of remote tests to be performed at multiple test locations;
sending, by a server, a first message to the navigation component included in the vehicle indicating a first portion of the sequence of remote tests to perform at a first test location of multiple test locations; and
sending, by a server, a second message to the additional navigation component included in the additional vehicle indicating a second portion of the sequence of remote tests to perform at a second test location of multiple test locations.
| 18. The method of claim 17, wherein the test data is first test data and further comprising:
causing the vehicle to navigate to the first test location;
causing the additional vehicle to navigate to the second test location;
receiving second test data from at least one of the vehicle or the additional vehicle; and
determining, by the server and based at least in part on the first test data and the second test data, network parameters associated with peak throughput for the one or more UEs accessing the network.
| 19. The method of claim 17, further comprising:
determining, by the server and based at least in part on the test data, network parameters associated with peak throughput for the one or more UEs accessing the network; and
causing the network to communicate with multiple user equipment (UEs) at a same time based at least in part on the test data.
| 20. The method of claim 17, wherein the vehicle comprises at least one of: an autonomous vehicle or an Unmanned Aerial Vehicle (UAV). | The method involves configuring a test message (108) indicating a test location and a test condition associated with a network (104) by a test server (106) of the network. The message is sent to a test module included in a vehicle (110) by the server. The test module determines a capability of the network at the test location in accordance with the test condition based at least in part on network data. The server sends the test location to a navigation component in the vehicle, causes the vehicle to navigate to the test location, receives test data from the test module indicating capabilities of the network at the test location based at least in part on the vehicle navigating to the test location, associates metadata with the test data from the test module, stores the test data and metadata in a database, and determines network parameters associated with peak throughput for UEs (112) accessing the network based at least in part on using the metadata to access the test data from the database. An INDEPENDENT CLAIM is included for a system for testing the integrity or reliability of a telecommunication network. Method for testing a network using a server to receive test data from remote test tools for testing the integrity or reliability of a telecommunication network such as a fifth generation (5G) telecommunication network. The method enables providing improved bandwidth and decreased response times to a multitude of devices that are connected to a network. The drawing shows a schematic diagram of a network environment in which devices can connect to a telecommunication network to implement the testing techniques. 104 Network, 106 Test server, 108 Test message, 110 Vehicle, 112 UE
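A minimal sketch of the test-dispatch and parameter-selection flow summarized above, assuming an in-memory list stands in for the database and a simple "pick the record with the highest measured throughput" rule. The message fields, metadata keys, and function names are illustrative assumptions.

```python
# Illustrative sketch: the server builds a test message, a vehicle-mounted test module
# returns test data, the server tags it with metadata, stores it, and selects the
# network parameters associated with peak throughput.

import time
from typing import List

def build_test_message(test_location, test_condition: str) -> dict:
    return {"location": test_location, "condition": test_condition}

def record_result(store: List[dict], test_data: dict, message: dict) -> None:
    """Attach metadata (location, condition, upload time) and persist the result."""
    store.append({"data": test_data,
                  "meta": {"location": message["location"],
                           "condition": message["condition"],
                           "uploaded_at": time.time()}})

def peak_throughput_parameters(store: List[dict]) -> dict:
    best = max(store, key=lambda r: r["data"]["throughput_mbps"])
    return best["data"]["parameters"]

if __name__ == "__main__":
    db: List[dict] = []
    msg = build_test_message((47.61, -122.33), "uplink_throughput")
    # Results like these would arrive from the test module after the vehicle
    # navigates to the test location; the values here are made up for the demo.
    record_result(db, {"throughput_mbps": 310,
                       "parameters": {"band_mhz": 2500, "tdd_ratio": "7:3"}}, msg)
    record_result(db, {"throughput_mbps": 480,
                       "parameters": {"band_mhz": 3700, "tdd_ratio": "4:1"}}, msg)
    print(peak_throughput_parameters(db))
```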
Please summarize the input | APPARATUS AND METHOD FOR TRANSMITTING VEHICLE INFORMATION Methods and systems are provided for generating a signal based on the information of the information system and transmitting/broadcasting the signal to the other vehicle. For example, a method includes receiving, by a processor, an indication that an information system of a vehicle is activated and, in response to receiving the indication, generating, by the processor, a signal based on the activated information system of the vehicle. The method further includes broadcasting the signal. | The method involves receiving (501) an indication that an information system of a vehicle is activated by a processor. A signal is generated (503) based on the activated information system of the vehicle by the processor in response to receiving the indication. The signal is broadcasted (505). The information system of the vehicle is activated by a driver of the vehicle or a driving system of the vehicle. The signal includes information associated with an action of the driver of the vehicle or the driving system of the vehicle. INDEPENDENT CLAIMS are included for the following: a system for processing information; a vehicle; and a non-transitory computer program product for generating a signal based on information of an information system and for transmitting/broadcasting the signal. Method for generating a signal based on information of an information system and for transmitting/broadcasting the signal to another vehicle (claimed). The information that the vehicle is braking is generated and transmitted by the signal generation and transmission system when a driver or the self-driving system of the vehicle activates the vehicle brake system. The risk caused in autonomous driving is reduced when the visual line of sight of a vehicle is blocked and the vehicle is taking an action that will affect other vehicles, and higher reliability is provided by supplying more and better information to those vehicles. The drawing shows a flowchart illustrating the method for generating a signal based on the information of the information system and for transmitting/broadcasting the signal. 501 Step for receiving an indication that an information system of a vehicle is activated by a processor; 503 Step for generating a signal based on the activated information system of the vehicle; 505 Step for broadcasting the signal
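The summarized method above (receive an indication that an information system such as the brake system is activated, generate a signal describing the driver's or driving system's action, and broadcast it) maps onto a very small event-handling routine. The sketch below is a hypothetical illustration only; the payload fields and the `PrintBroadcaster` stub are assumptions, not the patented apparatus.

```python
import json
import time


def on_information_system_activated(system_name: str, action: str, broadcaster) -> dict:
    """Generate and broadcast a signal when a vehicle information system is activated.

    `broadcaster` is any object with a `send(payload: bytes)` method, e.g. a V2X radio stub.
    """
    signal = {
        "system": system_name,  # e.g. "brake_system"
        "action": action,       # action of the driver or the driving system
        "timestamp": time.time(),
    }
    broadcaster.send(json.dumps(signal).encode("utf-8"))
    return signal


class PrintBroadcaster:
    """Stand-in for a real broadcast interface: just prints the outgoing payload."""

    def send(self, payload: bytes) -> None:
        print("broadcast:", payload.decode("utf-8"))


if __name__ == "__main__":
    on_information_system_activated("brake_system", "hard_braking", PrintBroadcaster())
```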
Please summarize the input | Data processing apparatus of V2X and LDM for autonomous vehicle system The information transmission device of the vehicle-to-object communication device and the local dynamic map device for an autonomous driving system according to the present invention includes: a V2X message transceiver for transmitting and receiving a V2X message; a C-ITS message processing unit for processing C-ITS messages transmitted and received through the V2X message transmission and reception unit; an LDM message transceiver for transmitting and receiving an LDM message; an LDM message processing unit for processing LDM messages transmitted and received through the LDM message transceiving unit; an autonomous driving system UDP message sending/receiving processing unit for sending and receiving an autonomous driving system UDP message through Ethernet communication with the autonomous driving system; an autonomous driving system CAN message transmission/reception processing unit that transmits and receives an autonomous driving system CAN message through CAN communication with the autonomous driving system; a vehicle information collection management unit for collecting and managing vehicle information processed by the autonomous driving system CAN message transmission and reception processing unit; and a real-time information processing unit that processes information in real time; the apparatus may be composed of these. | 1. A V2X message transmitting and receiving unit for transmitting and receiving V2X messages;
a C-ITS message processing unit for processing a C-ITS (Cooperative Intelligent Transport Systems) message transmitted and received through the V2X message transceiver;
an LDM message transceiver for transmitting and receiving an LDM (local dynamic map device) message;
an LDM message processing unit for processing an LDM message transmitted and received through the LDM message transceiving unit;
An autonomous driving system UDP message sending/receiving processing unit for sending and receiving an autonomous driving system UDP message through Ethernet communication with the autonomous driving system;
an autonomous driving system CAN message transmission/reception processing unit for transmitting and receiving an autonomous driving system CAN message through CAN communication with the autonomous driving system;
a vehicle information collection management unit for collecting and managing vehicle information processed by the autonomous driving system CAN message transmission and reception processing unit; and a real-time information processing unit that processes, in real time, information processed in the C-ITS message processing unit, the LDM message processing unit, the autonomous driving system UDP message transmission/reception processing unit, and the vehicle information collection and management unit; a vehicle-to-things communication device and local dynamic map device information delivery device for an autonomous driving system, characterized in that it comprises the foregoing.
| 2. The device of claim 1, wherein the real-time information processing unit inquires the nearest intersection from the intersection information based on the information processed by the C-ITS message processing unit and the LDM message processing unit, acquires the traffic light information of the inquired nearest intersection, and provides the information to the autonomous driving system through the autonomous driving system CAN message transmission/reception processing unit.
| 3. The device according to claim 1 or 2, wherein the real-time traffic information data processed by the C-ITS message processing unit, the LDM message processing unit and the real-time information processing unit is stored in a real-time traffic information data storage device, and the vehicle information data processed by the vehicle information collection and management unit and the vehicle information data of the autonomous driving system are stored in a vehicle information data storage device. | The device has a vehicle-to-things (V2X) message transmitting and receiving unit for transmitting/receiving V2X messages. A Cooperative Intelligent Transport Systems (C-ITS) message processing unit (112) processes a C-ITS message transmitted/received through a V2X message transceiver (110). A local dynamic map device (LDM) message is received/processed by an LDM message transceiving unit (120). An autonomous driving system (AD) message transmission/reception processing unit sends/receives an AD message through Ethernet communication. A vehicle information collection management unit collects and manages vehicle information processed by an AD CAN message transmission and reception processing unit (140). A real-time information processing unit (150) is utilized for processing information processed in the C-ITS message processing unit. Vehicle-to-object communication device (V2X) and local dynamic map device (LDM) information transmission device for an autonomous driving system. The information necessary for driving the vehicle, selected from the information around the vehicle and the regional dynamic information transmitted and received from the V2X OBU terminal and the LDM terminal, is provided to the autonomous driving system by quick and effective selection and processing. The drawing shows a schematic block diagram of a V2X and LDM information transmission device for an autonomous driving system (drawing includes non-English language text). 110 V2X message transceiver; 112 C-ITS message processing unit; 120 LDM message transceiving unit; 140 AD CAN message transmission and reception processing unit; 150 Real-time information processing unit
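Claim 2 of the apparatus above has the real-time information processing unit look up the nearest intersection from the C-ITS/LDM intersection information and hand its traffic light data to the autonomous driving system over CAN. The Python sketch below shows one plausible reading of that step; the LDM record layout and the 8-byte CAN payload format are invented for illustration and are not taken from the patent.

```python
import math
import struct
from typing import Dict, List, Tuple


def nearest_intersection(ego: Tuple[float, float],
                         intersections: List[Dict]) -> Dict:
    """Pick the intersection closest to the ego position (lat, lon in degrees).

    Uses an equirectangular approximation, which is adequate for the short
    ranges involved in roadside C-ITS / LDM data.
    """
    lat0 = math.radians(ego[0])

    def dist(ix: Dict) -> float:
        dlat = math.radians(ix["lat"] - ego[0])
        dlon = math.radians(ix["lon"] - ego[1]) * math.cos(lat0)
        return math.hypot(dlat, dlon) * 6371000.0  # metres

    return min(intersections, key=dist)


def pack_traffic_light_frame(intersection_id: int, phase: int, time_to_change_s: int) -> bytes:
    # Hypothetical 8-byte CAN payload: id (4 bytes), phase (1), time to change (2), reserved (1).
    return struct.pack(">IBHB", intersection_id, phase, time_to_change_s, 0)


if __name__ == "__main__":
    ldm = [{"id": 101, "lat": 37.401, "lon": 127.110, "phase": 2, "ttc": 12},
           {"id": 102, "lat": 37.405, "lon": 127.102, "phase": 1, "ttc": 4}]
    ix = nearest_intersection((37.4008, 127.1095), ldm)
    print(ix["id"], pack_traffic_light_frame(ix["id"], ix["phase"], ix["ttc"]).hex())
```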
Please summarize the input | Transportation, transportation vehicle and autonomous vehicle, equipment rental, and a rental system that can be shared or that provides infrastructure services for vehicle owners to rent and share their own vehicles. The invention relates to infrastructure, transportation, transportation vehicles, autonomous vehicles, transport box-cabins, equipment and systems that can be rented, shared or traded, providing infrastructure services for vehicle owners to rent/share their own vehicles and facilities and providing products and services that they can use in their own vehicles. It concerns the infrastructure and methods of the rental/sharing system and logistics system, which creates and operates the logistics and transportation system with the infrastructure created. | CLAIMS
1. It is a rental system that provides transportation, transportation vehicles and autonomous vehicles, equipment rental and sharing, or infrastructure services for vehicle owners to rent and share their own vehicles, comprising:
- users (67), user terminal (81) User ID/Payment Tools (65), Smart Hardware (45), station (79) or third party providers (83), where the entire system is managed automatically or manually by artificial intelligence algorithms, and Cloud or server-based control center (82) containing integration interfaces, Control Center Software and databases (82S) that enable communication with other systems (84),
- User Identity/Payment Tools (65) that enable users who want to receive service from the system to be defined in the system and that carry out or mediate payment collection transactions (65),
- into computing equipment such as mobile phones, tablets, computers, smart glasses, watches, brain, neuro-machine interfaces or autonomous machines, or third party hardware, which also contain user identification information, enable users to communicate with system elements and carry out their work that needs to be carried out in the system. embedded electronic circuit groups user terminal (81) and user terminal software (81 S) running on these hardware,
- located on transportation vehicles (106), transport boxes (117) or at the station (79) and enabling transportation vehicles to receive service, identify them in the system, station units (79), smart hardware (45B), User ID/Payment Tools (65), user Intelligent hardware (45) and identification hardware (51) or Slot hardware (50) that enable terminal (81) and other systems (84) to communicate and interoperate with transportation vehicles (106) and transport boxes (117),
- the station (79), whose infrastructure includes a parking lot that ensures the safety of the transportation vehicles (106) and transport boxes (117), their receiving of the desired services and their presence at the desired location, creates a stock area and allows users to interact with the system,
- Payment infrastructures that can receive offline and online payments that can communicate or work together with third party providers (83), smart hardware (45) and identification hardware (51) providing software, services and information that can be used in the system or processing third-party users and systems -pos devices,
- payment infrastructures (68),
- third party hardware, systems and applications (85),
- System software (90S),
- Artificial Intelligence Decision support systems (82.4) that perform learning, benchmarking, decision making and output production functions according to data and parameters coming from control center services and smart equipment and user terminals in the field,
- In the case of stations with more than one platform, where the station power and signal line (59) runs independently or within the smart hardware (45), it is characterized in that it includes a link conversion unit (37) comprising a link function unit (36) that duplicates the multiplexed connections to its ports (31) and switches them to other smart equipment.
| 2. It is the users (67) mentioned in Claim 1 and its feature is; It is characterized by the standard user (67S) representing the legal entity or acting on its behalf, and autonomous users (670) with autonomous mobility.
| 3. It is the users (67) mentioned in Claim 2 and its feature is;
- Undefined user role (67R1)
- Tenant role (67R2)
- Role of user, vehicle or vehicle owner receiving service from the system (67R3)
- Rental vehicle owner role (67R4)
- System attendant role (67R5)
- It is characterized by containing the hardware-software user role (67R6).
| 4. It is the user identification information mentioned in Claim 1 and its feature is;
- User information, biometric data (67D),
- Credentials and session keys (83D) obtained from third party identity providers (83)
- Identity/Payment Instruments information and keys (65D),
- User terminal identification information (81 D),
- Characterized by having smart hardware/hardware/slot hardware identifiers (45D, 45BD, 51D, 50D).
| 5. It is the Control Center Software and databases (82S) mentioned in Claim 1, and its feature is;
- System Interfaces (82.1)
- Integration service between systems (90) (82.2)
- Communication and Security Services and their structures (82.3)
- Artificial Intelligence Decision support systems (82.4)
- Map based Services and Routing Services (82.5)
- Autonomous Vehicle and Equipment Management Services (82.6)
- Fleet and Operations Management services (82.7)
- System hardware, tools and third party hardware and software update services and software templates database (82.8)
- Promotion and Marketing Automation Services (82.9)
- Log database (82.10) prepared according to system and operation and purpose
- Business partners and business models services and database(82.11)
- Automatic and manual system management services, system databases, report services and interfaces, (82.12)
- Blockchain transaction interfaces, services and integration interfaces (82.13)
- Integration services (82.14)
- It is characterized by its content.
| 6. It is the transportation vehicles (106) mentioned in claim 1, and its feature is; It is characterized by having at least one of the qualities of manned or unmanned, motorized or non-motorized, autonomous or conventional vehicle, transportation vehicle (106S) and Host vehicle (106K), used to carry a person, object or perform any task.
| 7. It is the transportation vehicle (106S) mentioned in claim 6 and its feature is; It is characterized by having at least one of the smart hardware (45), identification hardware (51), Slot hardware (50), third-party locking system or third-party identification units on it.
| 8. It is the Host vehicle (106K) mentioned in Claim 6, and its feature is;
- Intelligent hardware (45), identification hardware (51), Slot hardware (50), wireless identification unit, third-party identification unit, third-party locking system, or at least one or more of the lights or signals providing audio-visual identification simultaneously to have
- Contain at least one of the smart hardware, Slot hardware, or third-party locking system hardware that enables it to transport more than one box, object or transport cabinet (117) with one vehicle at the same time
- It is characterized in that it contains at least one of the smart hardware, Slot hardware or third party locking system hardware that enables it to carry a transport box-object-cab (117) or interlocked box-object and cabinets (117) simultaneously with another vehicle (106) according to requirements.
| 9. It is a transport box (117) consisting of boxes, cabins, passenger cabins, objects or machines with or without active features (electronic, software, electromechanical) produced for specific or general use as mentioned in Claim 1, and its feature is;
- Intelligent hardware (45), identification hardware (51), Slot hardware (50) or third-party lock systems which include box-object transport mode (45S10) software, and, on the host vehicle (106K) or stations, identification hardware, Slot hardware, barcode/datacode/visual marking, or third-party lock systems; it is characterized in that it contains at least one of these.
| 10. It is the Slot hardware (50) mentioned in Claim 1 and its feature is;
- It contains an automatic/manually lockable nail slot (48) where it can be inserted into the tabs in the smart hardware (45) or third-party lock system,
- In case of power, signal and peripheral connections (49) where the slot hardware (50) is mounted (vehicle, platform hardware, third party equipment, mounting brackets, carrying apparatus-transport boxes or any floor or wall) with smart hardware ( 45) to provide these connections bidirectionally,
- Creating the Slot hardware (50) Identification Hardware Memory Area / Identification information (50D) with the RFID tag ID on the slot hardware (50) or the serial number, QR code, visual signals or chips used for identification programmed on the tag,
- the hardware identification ID can be configured as a platform or station in the system so that it can be used by the system as a station identification ID,
- the hardware identification ID can be configured in the system as a vehicle or transport box so that it can be used by the system as a vehicle or transport box identification ID,
- contains magnets or magnetic fields on the ground to facilitate mounting on metal surfaces,
- It is characterized in that the Slot hardware contains memory space/identification information (50D) which allows operation data to be stored and sent to the control center via the vehicle (106), transport box (117), smart equipment (45), user smart equipment (45), user terminal (81), station computer (89) or other smart hardware (45) in the station (79).
| 11th. It is the identification equipment (51) mentioned in Claim 1 and its feature is;
- the information given by the chips providing a unique serial number on it, the identification information obtained from QR codes, the RFID tag serial number or a programmed serial number, at least one of which creates a Cookie Memory Space / Cookies (51D),
- the vehicle, the station, the equipment contain the chip and antenna (53) that give the Identification Hardware Memory Area / Cookies (51 D),
- It contains magnets with N and S poles that allow the surface of the smart hardware (45) to hold on the floor,
- includes a cut-resistant cable-chain attached to the equipment floor,
- It contains Locking Pin-arms (54) that can be used with the Smart Hardware (45) locking module (22), which is in a closed position with the weak magnet on its floor, and which provides locking by pulling towards the module when the locking module (22) approaches,
- It is characterized in that it contains the memory area / cookies (51D) of the identification hardware, which allows operation data to be stored and sent to the control center via the vehicle (106), transport box (117), smart hardware (45), user smart equipment (45), user terminal (81), station computer (89) or other smart hardware (45) in the station (79).
| 12. It is the station (79) mentioned in claim 1 and its feature is;
- consists of at least one of the fixed mobile station (79A), fixed station (79B), mobile station (79C), restricted area/point type station (79D) configurations,
- It is characterized by being included in at least one of the System Station (79S1) or Special Station (79S2) class.
| 13. It is the fixed mobile station (79A) mentioned in claim 12 and its feature is;
- Positioning the hardware and software that make up the station on the host vehicle (106K),
- With smart hardware (79.1), with identification hardware (79.2), with wireless identification hardware (79.3), with Slot hardware (79.4) and with autonomous switching hardware (79.5), with kiosk equipped (with kiosk computer and peripherals) where stations are installed together with kiosk computer and peripherals. 79.6) contain at least one or more of the station types at the same time,
- Smart hardware running vehicle mode (45S1), autonomous vehicle mode (45S7), object-box transport mode (45S10), third-party hardware mode (45S3) or basic mode (45S10) software, identification hardware (51) and Slot hardware (50) can be locked to existing system elements and third-party locking systems.
| 14. It is the fixed station (79B) mentioned in claim 12 and its feature is;
- Fixed to fixed floors and surfaces, whose locations and locations are always defined,
- Kiosk/ kiosk where stations with smart hardware (79.1), identification hardware (79.2), wireless identification hardware (79.3), slot hardware (79.4) and autonomous switching hardware (79.5) stations are installed together with the platform and kiosk computer and peripherals. It is characterized in that it includes at least one of the station types (79.6) equipped with platform.
| 15. It is the Mobile station (79C) mentioned in Claim 12 and its feature is; Information processing equipment or smart equipment that can create a coverage area with wireless communication in order to collect the vehicles, to carry out rental or other services and to follow the station signal of the vehicles, where the location of the user terminal or the vehicle that generates the signal is used as the station location. , 45B), vehicles within the scope of wireless communication, It is characterized by the fact that they can identify the transport boxes or the identification hardware (51), the Slot hardware (50), the smart hardware (45, 45B) are within the scope of RFID reading and the vehicles are created by identifying the transport boxes by reading the identification information provided by the RFID tags.
| 16. It is the restricted area/point type station (79D) mentioned in claim 12 and its feature is; It includes hardware such as a third party lock (64), integrated into the smart hardware (45) on the vehicle or the transport box, that informs the control center of the coordinate where the vehicle or the transport box is located, or the integration interface of the control center (82), so that the location can be controlled and the station functions can be provided; and it is characterized by being a station type that covers an area or point whose coordinates are determined by the vehicle or transport box owners from the control center or the user terminal.
| 17. It is a Station (79.6) with or without a Kiosk, where the stations mentioned in Claims 13 and 14 are installed together with the platform and kiosk computer and peripherals.
- station kiosk including software and databases (89S)
- contains smart hardware (45) working in gateway mode (45S4) connected to the station computer
- contain at least one of the smart hardware (45) or identification equipment (51) or Slot hardware (51) operating in platform mode (45s6) on platforms, if it contains smart hardware (45) operating in gateway mode (45S4) connected to the station computer
- contains at least one of the smart hardware (45) or identification hardware (51) or Slot hardware (51) operating in station mode (45s5) on platforms that can be directly connected to the station computer, if it does not contain intelligent hardware (45) operating in gateway mode (45S4) connected to the station computer
- it is characterized in that the platform smart equipment (45) contains a third party locking system if the hardware does not have a pin locking system module (69) or a locking system module (22) containing a Hall sensor.
| 18. The station kiosk mentioned in Claim 17 is software and databases (89S) and its feature is;
- Creating system interface for users,
- Creating the unique IDs and identification information of the station and the equipment on the station platforms,
- Users can perform system registration, information update, data monitoring/reporting, service procurement, payment transactions,
- Carrying out offline or online car rental, leasing, returning the rented car, providing service,
- station equipment, smart equipment (45), identification equipment (51) and smart equipment (45) contained in vehicles (106) and transport boxes (117),
identification hardware (51), third party lock hardware (64), communication and management of necessary procedures,
- Execution of station installation and technical support procedures,
- Managing online and offline payment procedures,
- Managing the user, environmental warning and alarm system,
- Keeping a database of station equipment inventory and status, vehicle inventory and status, platform status and reservations, and all transaction information performed at the station and putting the necessary transactions into effect,
- Ensuring the connection with the control center system interface and synchronizing with the system databases and executing the system operations,
- Execution of control center procedures at the station, platforms and vehicles connected to the station,
- Making and broadcasting announcements/warnings and advertisements by the system during waiting times when there is no user session of the station computer,
- The route, tariff, location, host vehicle of the host vehicle (106) or the host vehicle used in the system (106K) with the information coming from the city, region and general transportation systems, third party providers(83), control center(82) the arrival time of the vehicle (106K) to the station (79) or the station (79) at the nearby/determined location, its capacity, the infrastructure in the host vehicle, the vehicle that can be rented, used, purchased (106s), transport box (117), or listing of services, route planning and, if necessary, making reservations, purchases or related transactions
- Execution and monitoring of all system(90) transactions for which the user(67) is authorized
- It is characterized by the execution of the communication procedure with the user terminal/software and hardware.
| 19. It is the smart hardware (45) mentioned in Claim 1, and its feature is;
- Containing smart firmware (45s) that enables one or more modes to run simultaneously,
- When the smart hardware is directly connected to the station kiosk computer, reading the users' identity / payment tools (65), establishing communication with the user wireless equipment, transmitting the alarms, status and transaction information of the station electronic equipment, smart hardware (45), and battery to the station computer, and enabling the management of these equipment, Initiating and managing the communication between the station computer and the smart equipment (45) on the platforms, running the communication protocols such as RS485/RS422 or other serial and parallel communication layers, when the communication protocol needs to select master / slave mode, the gateway Smart hardware is in Master Mode. or kiosk gateway mode (45S4), which allows it to work in slave mode or stay in listening mode,
- Performing the commands of the station kiosk software by communicating with the smart hardware operating in the gateway mode, the procedures for vehicle identification, user identification, payment information identification by communicating with the Smart hardware (45) identification hardware (51) or third party identification RFID tags and other identification equipment. to be fulfilled, Performing the locking and unlocking work on the platform by driving the motor system in the smart hardware (45) or in the third party lock system, by checking the status of the switches, RFID tags and hardware that enable the vehicle to be identified on the platform, in the third party lock system or in the smart hardware. the open, closed position of the lock, whether the vehicle parked on the platform is on the platform, Checking whether there are unauthorized unlocking attempts and whether the rented vehicle has left the platform or whether the parked vehicle is properly parked according to whether it is defined in the system and whether the return procedure has been completed.
User warning unit (58) (a warning that can appeal to all senses such as buzzer, led, loudspeaker, visual, auditory, etc.) or smart hardware (45) on the platform during rental, return and adding/removing vehicles to the system. control and operation of the systems according to the defined function groups, on the platform and on the vehicle, if there is a charging module in the smart hardware (45) and identification equipment (51) in the transport box (117), the vehicle equipment charging procedure is managed, if the vehicle is equipped with a sensor infrastructure on the transport box (117), in the smart hardware (45), identification equipment (51) or slot if the necessary sensor and control hardware connection is available in the hardware (50), reading the sensor information from the vehicle, If there is a payment information or payment hardware module in the smart hardware (45) and identification equipment (51) on the platform and on the vehicle, making payment and collection transactions, accepting the return process offline if the station computer or the smart hardware that plays the gateway cannot be accessed, when the platform computer and the system are reached. transfer the necessary information to the system, daisy chain, point to point, If one of the token ring topologies is used, the information coming from the previously located smart hardware on the station platforms is routed to the next smart hardware by making the necessary changes according to the process or without changing it, to apply the protocols determined by the system during installation, update and use, to apply the protocols that are in technical support mode and technical support personnel with their smart hardware. to communicate, to transmit fault information, to make software updates, execution of installation and troubleshooting procedures, creation of unique platform ID in the system using the ID information obtained from the smart hardware (45) system processor and circuit group and other hardware or software created cookies, the establishment of the unique platform ID in the system and the errors and failures in the platform hardware. keeping log information, sending malfunctions and errors to the control center via the station infrastructure, the actions taken, sent by the system
updates and parameters, the parameters determined during installation and production are stored in the local database and memory, if there is a positioning module, the station location and changes are notified to the control center and users, if the platform has slot hardware (50) and the platform smart hardware (45) has the necessary module reading the slot hardware (50) ID, mapping it to the smart hardware (45) ID, platform mode (45S6), which enables power and signal connections to be activated,
- Fulfilling the commands of the control center by communicating with the smart hardware (45), station computer (89) operating in the Kiosk Gateway Mode (45S4) or station mode (45S5), and the control center System Interfaces (82.1) using the communication module/unit (95), Vehicle identification by communicating with Intelligent hardware (45) identification equipment (51) or RFID tags and other identification equipment that provide third-party identification on the vehicle (106) and transport boxes (117), or by detecting the identification information with the visual data processing module (102), fulfillment of user identification, station identification, identification of payment information procedures, Carrying out the locking and unlocking job by driving the motor system in the smart hardware (45) or in the third party lock system, checking the status of the switches/sensors in the smart hardware or the third party lock system, RFID tags and hardware that enable the vehicle to be identified and unlocking the engine driven lock , its closed position, whether the vehicle parked on the platform is on the platform, whether there have been any unauthorized unlocking attempts, and the leased vehicle, Checking whether the transport box has left the platform or whether the parked vehicle is properly parked according to whether it is defined in the system and whether the return procedure has been completed, user warning systems (58) (buzzer, led, speaker) during rental, return and adding / removing vehicles to the system. warning systems that can appeal to all sensory organs such as visual, auditory, etc.) or user warning systems on smart hardware (45).
control and operation according to the function groups, if there is a power connection to the smart hardware and if there is a charging module in the smart hardware (45) and identification equipment (51) on the vehicle or the carrying box, if there is a charging module in the socket hardware, managing the charging procedure of the vehicle equipment, if there is a sensor infrastructure on the vehicle, if the necessary sensor and control hardware connection is available in the smart hardware (45) and identification equipment (51), Reading the sensor information from the transport box (117), making payment information in the smart hardware and the smart hardware (45) and identification hardware (51), the socket hardware (50) or the payment and collection transactions if there is a payment hardware module, to the control center (82), Accepting offline return if the smart hardware (45) running the station computer (89) or kiosk gateway mode (45S4) or other station mode (45S5) cannot be accessed, transferring the necessary information when the system is reached, if daisy chain, point to point, token ring topologies are used, the information coming from the previous smart hardware on the station platforms is routed to the next smart hardware with or without making the necessary changes according to the process, to apply the protocols determined by the system during installation and update and use, Using ID information from hardware (45) system processor and circuit assembly (11) and other hardware running technical support mode (45S8) and technical support personnel communicating with smart hardware (45), transmitting fault information, performing software updates, executing installation procedures, Creating the unique platform ID in the system (90) or station (79), keeping the log information of the transactions and errors that occur in the system and during the operations, and the failures in the hardware, sending malfunctions and errors to the control center, operations, updates and parameters sent by the system, storing the parameters determined during installation, during production in the local database and memory, communicating with user terminals wirelessly and user transactions, rental, return, vehicle addition, service receiving
Ensuring the execution of procedures, enabling Users(67) to run payment methods for payment/collection transactions from user ID/payment tools(65), smart hardware(45), socket hardware(50), identification hardware(51), third party hardware, 67), vehicles (106), payment-collection in transport boxes (117), electronic circuit group and software (99) or visual data processing unit containing Global location and indoor positioning technology, Determining the station location by utilizing the data using RFID readers and tags and notifying the location and changes to the control center, users, user terminal software (81S), other systems (84) and smart hardware (45), with the location information obtained, determining the location of the vehicle (106), the transport box (117) or the user terminal, performing the tracking and docking functions, If there is a slot hardware (50) on the platform and the platform smart hardware (45) has the necessary module, the station mode (45S5) that enables to read the ID of the slot hardware (50), match it with the smart hardware (45) ID, and activate the power and signal connections,
- Connecting to smart equipment working with platform mode (45S6) and station mode (45S5), reading user identity/payment tools(65), authentication and payment transactions, and visible Light Channels and Warning Interface User interface mode (45S9) that enables interaction with the user as (39) or User Alert Unit(58),
- communication with other smart hardware (45) and identification equipment (51) and hardware connected by connecting or locking the socket hardware, Technical support user(67) via wireless or wired port to terminal(81) or Communication module/unit(95) or nearby network Connecting to System Interfaces (82.1) by means of means of accessing the data and systems of the vehicle or station or transport boxes (117) to which the smart equipment (45) is connected, connecting to sensors and subsystems, to the logs and transaction database on the hardware to which it is connected.
access, storage, transfer to the control center(82) and user terminal(81), computer, the hardware that is connected with the software and parameters in the memory of the hardware operating in technical support mode or downloaded in the control center, and error/fault correction procedures on the vehicle to which the hardware is connected, installation and automatic or manual execution of disassembly procedures, hardware-software update procedures, Providing energy to the hardware in case the connected hardware has the necessary modules during error correction, connecting to the vehicle technical support team, equipment, systems in case it serves the hardware in third party vehicles and creating an interface for error/fault correction operations, carrying out automatic or manual error removal procedures, technical support mode (45S8), which enables the execution of test procedures on the hardware it is connected to by emulating/simulating other hardware modes,
- Identification of the station / platform parked or taken from the park by communicating with the smart hardware (45) identification equipment (51) or RFID tags and other identification equipment that provide third party identification, and by detecting the identification information of the visual data processing unit (102) on the platforms, fulfillment of the procedures for transferring payment information, the driving modes determined and programmed by the manufacturer or the user in the control center interface and terminals, transferring vehicle behaviors, vehicle management data, routes and other preferences to the vehicle rented or owned by means of smart hardware, Connecting to the control center (82.1) system interfaces (82.1) via the communication infrastructure and with the Communication module / unit (95) on the vehicle smart hardware (45), receiving software and operational information updates, executing troubleshooting procedures, accumulating and generated logs and data
In case of rental, if the control center is a vehicle with a system interface, transfer to the system interfaces (82.1) or third party ser... | The infrastructure provides infrastructure services for vehicle owners to rent/share their own vehicles and facilities, and provides products and services that can be used in their own vehicles. The rental/sharing system and logistics system are set up to create and operate the transportation system. The infrastructure, transportation vehicles, autonomous vehicles, transport box-cabins, and equipment of the invention can be rented, shared, or traded. The infrastructure of the rental/sharing system and logistics system creates and operates the logistics and transportation system.
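Claim 19 above repeatedly mentions accepting a rental return offline when the station computer, the gateway smart hardware or the control center cannot be reached, and transferring the stored information once the system is reachable again. The store-and-forward sketch below illustrates that idea only; the `OfflineReturnQueue` class and the `upload` callback are assumptions, not the claimed smart-hardware firmware.

```python
from dataclasses import dataclass, asdict
from typing import Callable, List
import time


@dataclass
class ReturnRecord:
    vehicle_id: str
    platform_id: str
    user_id: str
    timestamp: float


class OfflineReturnQueue:
    """Accept returns locally and forward them to the control center when reachable."""

    def __init__(self, upload: Callable[[dict], bool]):
        self._upload = upload          # returns True when the control center accepted the record
        self._pending: List[ReturnRecord] = []

    def accept_return(self, vehicle_id: str, platform_id: str, user_id: str) -> None:
        # Record the return even if no connection is available right now.
        self._pending.append(ReturnRecord(vehicle_id, platform_id, user_id, time.time()))

    def sync(self) -> int:
        """Try to push pending records; keep the ones that still fail."""
        remaining = [r for r in self._pending if not self._upload(asdict(r))]
        sent = len(self._pending) - len(remaining)
        self._pending = remaining
        return sent


if __name__ == "__main__":
    queue = OfflineReturnQueue(upload=lambda record: True)  # stand-in for the control-center API
    queue.accept_return("bike-42", "platform-7", "user-9")
    print("records synced:", queue.sync())
```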
Please summarize the input | METHOD AND COMPUTING DEVICE FOR PLANNING AN AUTONOMOUS DRIVING OF A SUBJECT AUTONOMOUS VEHICLE A method for planning an autonomous driving by using a V2X communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist is provided. And the method includes steps of: (a) a computing device [100], corresponding to a subject autonomous vehicle, instructing a planning module [150] to acquire recognition information on surrounding vehicles including (i) first vehicles capable of a V2X communication and (ii) second vehicles incapable of the V2X communication; (b) the computing device [100] instructing the planning module [150] to select an interfering vehicle among the surrounding vehicles; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model, and to modify current optimized route information in order to evade a potential interfering action, to thereby generate updated optimized route information of the subject autonomous vehicle. | 1. A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (a), the computing device [100] instructs a 1-st neural network included in the image processing module [140] to acquire at least one circumstance image, corresponding to at least one direction from the subject autonomous vehicle, through at least one camera installed to the subject autonomous vehicle, and to apply one or more 1-st neural network operations to the circumstance image, to thereby generate the recognition information and then to thereby transmit the recognition information
to the planning module [150], wherein the recognition information includes at least part of (i) vehicle identifier information, (ii) vehicle exterior information, and (iii) vehicle relative location information from the subject autonomous vehicle.
| 2. The method of Claim 1, wherein, at the step of (b), the computing device [100] instructs the planning module [150] to (i) generate a scheduled direction vector by using scheduled direction information on a direction to which the subject autonomous vehicle is planned to move in a threshold time and generate one or more relative location vectors by using the vehicle relative location information corresponding to at least part of the surrounding vehicles, (ii) generate each of similarity scores between the scheduled direction vector and each of the relative location vectors, and (iii) select at least one specific surrounding vehicle, among the surrounding vehicles, as the interfering vehicle, whose specific similarity score is larger than a threshold similarity score.
| 3. The method of Claim 1, wherein, at the step of (b), the computing device [100] instructs the planning module [150] to select at least one specific surrounding vehicle, whose corresponding partial image is located in a current lane region, corresponding to a current lane of a road including the subject autonomous vehicle, of the circumstance image, as the interfering vehicle, by referring to information on locations, of bounding boxes including the surrounding vehicles, on the circumstance image, which is acquired by using the image processing module [140].
| 4. A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (c), the computing device [100] , if the interfering vehicle belongs to the second group, (i) instructs the image processing module (i-1) to acquire a modeling image including the interfering vehicle through at least one camera installed to the subject autonomous vehicle and (i-2) to apply one or more 2-nd neural network operations to the modeling image, to thereby generate acceleration capability information of the interfering vehicle, and then to thereby transmit the acceleration capability information to the planning module, and (ii) instructs the planning module to generate the potential interference prediction model by referring to the acceleration capability information and current velocity information of the interfering vehicle acquired by using at least one of the image processing module [140] and the V2X communication module [130].
| 5. The method of Claim 4, wherein, at the step of (c), the computing device [100] instructs a 2-nd neural network included in the image processing module [140] to apply one or more (2-1)-st neural network operations, among the 2-nd neural network operations, to the modeling image by additionally referring to a self-vehicle velocity information of the subject autonomous vehicle, to thereby generate (i) relative velocity information of the interfering vehicle in relation to the subject autonomous vehicle, (ii) category information corresponding to a class of the interfering vehicle, and (iii) acceleration variable information corresponding to at least part of a mass and a volume of the interfering vehicle, and instructs the 2-nd neural network to apply one or more (2-2)-nd neural network operations, among the 2-nd neural network operations, to a concatenated vector including the relative velocity information, the category information and the acceleration variable information as its components, to thereby generate the acceleration capability information of the interfering vehicle.
| 6. The method of Claim 5, wherein, at the step of (c), the computing device [100] instructs the 2-nd neural network to apply the (2-2)-nd neural network operations to the concatenated vector, further including current section average velocity information on an average velocity of vehicles in a current section of a road where the surrounding vehicles and the subject autonomous vehicle are driving currently, to thereby generate the acceleration capability information.
| 7. The method of Claim 4, wherein the computing device [100] instructs the planning module [150] to generate the potential interference prediction model by referring to the acceleration capability information and the current velocity information, generated by referring to velocity ratio information of the interfering vehicle and current section average velocity information, wherein the velocity ratio information has been generated by comparing each of average velocities for each of past sections of a road, where the surrounding vehicles and the subject autonomous vehicle have been driving, with each of velocities of the interfering vehicle in each of the past sections, and transmitted from a center server to the V2X communication module [130], and the current section average velocity information has been generated by calculating an average velocity of vehicles in a current section of a road where the surrounding vehicles and the subject autonomous vehicle are driving currently, and transmitted from the center server to the V2X communication module [130].
| 8. The method of Claim 4, wherein the computing device [100] instructs the 2-nd neural network to apply the 2-nd neural network operations to the modeling image, to thereby generate the current velocity information along with the acceleration capability information and then to thereby transmit the current velocity information and the acceleration capability information to the planning module, and instructs the planning module [150] to generate the potential interference prediction model by referring to the current velocity information and the acceleration capability information.
| 9. The method of Claim 4, wherein the computing device [100] instructs the planning module [150] to generate estimated velocity range information by referring to (i) a TTC value corresponding to a time for the subject autonomous vehicle to evade the potential interfering action, (ii) the acceleration capability information and (iii) the current velocity information, to thereby generate the potential interference prediction model including the estimated velocity range information.
| 10. A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (c), the computing device [100], if the interfering vehicle belongs to the second group, instructs the V2X communication module [130] to (i) acquire (i-1) acceleration capability information of the interfering vehicle, (i-2) velocity ratio information of the interfering vehicle generated by comparing each of average velocities for each of past sections of a road, where the surrounding vehicles and the subject autonomous vehicle have been driving, with each of velocities of the interfering vehicle in each of the past sections, and (i-3) current section average velocity information generated by calculating an average velocity of vehicles in the current section, from the center server, (ii) generate current velocity information of the interfering vehicle by referring to the velocity ratio information and the current section average velocity information, and (iii) generate the potential interference prediction model by referring to the current velocity information and the acceleration capability information.
| 11. The method of Claim 1, wherein, at the step of (c), the computing device [100], if the interfering vehicle belongs to the first group, instructs the V2X communication module [130] to acquire scheduled route information of the interfering vehicle by communicating with the interfering vehicle, and instructs the planning module [150] to generate the potential interference prediction model by referring to the scheduled route information.
| 12. A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (c), the computing device [100] instructs the planning module [150] to acquire lane average velocity information of at least one surrounding lane located in at least one of a left side and a right side of a current lane including the subject autonomous vehicle, and to modify the current optimized route information in order to add an evading action, to be executed in correspondence with the surrounding lane in order to evade the potential interfering action, by referring to the lane average velocity information, to thereby generate the updated optimized route information.
| 13. A computing device [100] for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising: at least one memory [115]; and at least one processor [120] configured to perform processes of: (I) instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (II) instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (III) instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the process of (I), the processor instructs a 1-st neural network included in the image processing module [140] to acquire at least one circumstance image, corresponding to at least one direction from the subject autonomous vehicle, through at least one camera installed to the subject autonomous vehicle, and to apply one or more 1-st neural network operations to the circumstance image, to thereby generate the recognition information and then to thereby transmit the recognition information to the planning module [150], wherein the recognition information includes at least part of (i) vehicle identifier information, (ii) vehicle exterior information, and (iii) vehicle relative location information from the subject autonomous vehicle.
| 14. The computing device [100] of Claim 13, wherein, at the process of (II), the processor instructs the planning module [150] to (i) generate a scheduled direction vector by using scheduled direction information on a direction to which the subject autonomous vehicle is planned to move in a threshold time and generate one or more relative location vectors by using the vehicle relative location information corresponding to at least part of the surrounding vehicles, (ii) generate each of similarity scores between the scheduled direction vector and each of the relative location vectors, and (iii) select at least one specific surrounding vehicle, among the surrounding vehicles, as the interfering vehicle, whose specific similarity score is larger than a threshold similarity score.
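Claim 14 scores surrounding vehicles by the similarity between the scheduled direction vector and each relative location vector. The sketch below uses cosine similarity and a fixed threshold as stand-ins; the claim does not fix the similarity measure or the threshold value.

```python
# Minimal sketch: selecting interfering vehicles whose relative-location vectors
# are similar to the ego vehicle's scheduled direction vector. Cosine similarity
# is an assumption; the claim leaves the similarity measure open.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def select_interfering(scheduled_direction, relative_locations, threshold=0.8):
    """Return ids of surrounding vehicles whose similarity score exceeds the threshold."""
    return [vid for vid, rel in relative_locations.items()
            if cosine_similarity(scheduled_direction, rel) > threshold]

if __name__ == "__main__":
    direction = (0.0, 1.0)                                   # planned heading
    rel = {"car_a": (0.1, 0.9), "car_b": (-1.0, 0.1)}        # relative positions
    print(select_interfering(direction, rel))                 # -> ['car_a']
```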
| 15. The computing device [100] of Claim 13, wherein, at the process of (II), the processor instructs the planning module [150] to select at least one specific surrounding vehicle, whose corresponding partial image is located in a current lane region, corresponding to a current lane of a road including the subject autonomous vehicle, of the circumstance image, as the interfering vehicle, by referring to information on locations, of bounding boxes including the surrounding vehicles, on the circumstance image, which is acquired by using the image processing module [140].
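For claim 15, a minimal sketch of the in-lane test on the circumstance image is given below; representing the current-lane region as an x-interval and testing the bounding-box centre are simplifying assumptions.

```python
# Minimal sketch (assumed image-space representation): selecting vehicles whose
# bounding boxes fall inside the current-lane region of the circumstance image.

def box_in_lane(box, lane_x_min, lane_x_max):
    """box = (x1, y1, x2, y2) in pixels; test whether its horizontal centre lies
    inside the current-lane region, here approximated by an x-interval."""
    cx = 0.5 * (box[0] + box[2])
    return lane_x_min <= cx <= lane_x_max

if __name__ == "__main__":
    boxes = {"car_a": (300, 200, 380, 260), "car_b": (60, 210, 140, 270)}
    in_lane = [vid for vid, b in boxes.items() if box_in_lane(b, 260, 420)]
    print(in_lane)   # -> ['car_a']
```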
| 16. A computing device [100] for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X commu... | The method involves a computing device (100) instructing a planning module (150) to acquire recognition information on at least part of surrounding vehicles, which include at least part of (i) a first group of one or more first vehicles that are capable of a V2X communication and are located closer than a threshold distance from a subject autonomous vehicle corresponding to the computing device and (ii) a second group of one or more second vehicles that are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module (130) and an image processing module (140). The computing device instructs the planning module to select at least one interfering vehicle among the surrounding vehicles. An INDEPENDENT CLAIM is included for a computing device for planning an autonomous driving by using a V2X communication and an image processing under a road circumstance. Method for planning an autonomous driving by using a V2X communication and an image processing under a road circumstance. The computing device may instruct the planning module to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle. The drawing shows a schematic representation of a method for planning an autonomous driving by using a V2X communication and an image processing under a road circumstance. 100 Computing device, 120 Processor, 130 V2X communication module, 140 Image processing module, 150 Planning module |
Please summarize the input | METHOD AND DEVICE FOR ATTENTION-DRIVEN RESOURCE ALLOCATION BY USING REINFORCEMENT LEARNING AND V2X COMMUNICATION TO THEREBY ACHIEVE SAFETY OF AUTONOMOUS DRIVING. A method for achieving better autonomous driving performance while saving computing power by using a confidence score representing reliability of object detection, generated in parallel with the object detection process, is provided, comprising: (a) a computing device acquiring at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle; (b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the circumstance image at least once to generate initial object information and initial confidence information for the circumstance image; and (c) the computing device, through V2X communication with at least some of the surrounding objects and support of the reinforcement learning agent, with reference to the initial object information and the initial confidence information, generating final object information for the circumstance image.|1. A method for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: (a) acquiring, by the computing device, at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle;
(b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the circumstance image at least once to generate initial object information and initial confidence information for the circumstance image; and (c) the computing device generating final object information for the circumstance image by referring to the initial object information and the initial confidence information, through the support of the reinforcement learning agent and V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold; wherein, in step (c), the computing device causes the reinforcement learning agent to (i) select, from among the surrounding objects, one or more specific surrounding objects corresponding to one or more specific target areas on the circumstance image to which a complementary operation is to be applied, with reference to the initial confidence information, the basic meta information of the surrounding objects and the sensor information of the image sensor, (ii) obtain supplementary information through the V2X communication with the specific surrounding objects, and (iii) generate the final object information by adjusting the initial object information using the supplementary information.
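As a rough illustration of the wherein clause of claim 1: low-confidence target areas are selected, supplementary information is requested over V2X for those areas, and the initial detections are adjusted with it. All names and the 0.5 threshold below are hypothetical.

```python
# Minimal sketch (all names hypothetical): replacing low-confidence detections with
# supplementary V2X information obtained for the selected target regions.

def select_target_regions(initial_confidence, threshold=0.5):
    """Pick region ids whose confidence is low enough to warrant a V2X complement."""
    return [rid for rid, conf in initial_confidence.items() if conf < threshold]

def adjust_detections(initial_objects, supplementary_info):
    """Overwrite (or insert) detections for regions covered by supplementary info."""
    final_objects = dict(initial_objects)
    final_objects.update(supplementary_info)
    return final_objects

if __name__ == "__main__":
    confidence = {"roi_1": 0.92, "roi_2": 0.31}
    targets = select_target_regions(confidence)               # -> ['roi_2']
    initial = {"roi_1": "car", "roi_2": "unknown"}
    supplement = {"roi_2": "truck"}                            # received over V2X
    print(targets, adjust_detections(initial, supplement))
```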
| 2. delete
| 3. The method of claim 1, wherein, when the initial confidence information, the basic meta information, and the sensor information are input to the reinforcement learning agent, the reinforcement learning agent (i) selects the one or more specific surrounding objects by using its parameters, (ii) generates at least one reward with reference to the supplementary information, and (iii) learns at least a portion of the parameters with reference to the reward.
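Claim 3 describes a reward-driven update of the agent's parameters. The code below is one hypothetical realisation using a logistic selection policy and a REINFORCE-style gradient; the policy form, feature layout, and reward definition are assumptions rather than the patent's specification.

```python
# Minimal sketch of the reward-driven update hinted at in claim 3: a logistic
# selection policy over candidate objects, updated with a REINFORCE-style gradient.
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def select(weights, features):
    """Sample a binary 'query this object over V2X' decision per candidate."""
    probs = [sigmoid(sum(w * f for w, f in zip(weights, feat))) for feat in features]
    actions = [1 if random.random() < p else 0 for p in probs]
    return actions, probs

def reinforce_update(weights, features, actions, probs, reward, lr=0.05):
    """grad log pi for a Bernoulli policy is (a - p) * x, here scaled by the reward."""
    for feat, a, p in zip(features, actions, probs):
        for j, f in enumerate(feat):
            weights[j] += lr * reward * (a - p) * f
    return weights

if __name__ == "__main__":
    random.seed(0)
    weights = [0.0, 0.0]
    feats = [[0.2, 1.0], [0.9, 1.0]]          # e.g. [initial confidence, bias]
    acts, probs = select(weights, feats)
    reward = 1.0 if any(acts) else -0.1       # stand-in for supplementary-info gain
    print(reinforce_update(weights, feats, acts, probs, reward))
```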
| 4. The method of claim 1, wherein the computing device causes the reinforcement learning agent to determine whether to select the one or more specific surrounding objects by using at least some of (i) relative position information and scheduled path information of the surrounding objects, included in the basic meta information, (ii) FOV (Field-Of-View) information, internal parameter information, external parameter information and distortion information, included in the sensor information, and (iii) the initial confidence information.
| 5. The method of claim 1, wherein the computing device generates the final object information by adjusting the initial object information with reference to supplementary information, which includes at least a portion of reference object information and reference confidence information generated by the specific surrounding object, and specific metadata of the specific surrounding object, and wherein the specific surrounding object generates the reference object information and the reference confidence information by performing object detection on its own surrounding objects.
| 6. The method of claim 1, further comprising: (d) the computing device transmitting the final object information to an autonomous driving module, thereby causing the autonomous driving module to perform autonomous driving of the target vehicle using the final object information.
| 7. A method for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: (a) acquiring, by the computing device, at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle;
(b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the circumstance image at least once to generate initial object information and initial confidence information for the circumstance image; and (c) the computing device generating final object information for the circumstance image by referring to the initial object information and the initial confidence information, through the support of the reinforcement learning agent and V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold; wherein, before the step (a), the CNN has been learned by a learning apparatus performing steps of: (a1) when a training image is obtained, causing at least one convolutional layer included in the CNN to generate at least one convolutional feature map by applying a convolution operation to the training image at least once;
(a2) the learning apparatus, while causing at least one anchor layer included in an RPN (Region Proposal Network) to perform a process of generating predicted ROIs (Regions Of Interest) on the training image by applying an anchor operation to the convolutional feature map at least once, generating an RPN confidence map including RPN confidence scores by generating, for each pixel of the convolutional feature map, each at least one RPN confidence score representing each at least one probability that the predicted ROI is the same as a ground truth (GT) ROI;
(a3) when at least one ROI-pooled feature map generated by using the convolutional feature map and the predicted ROI is obtained through an ROI-pooling layer included in the CNN, the learning apparatus, while performing a process of generating a prediction object detection result by using the ROI-pooled feature map, causing an FC layer included in the CNN to generate a CNN confidence map including CNN confidence scores by generating, for each predicted ROI, each CNN confidence score indicating each at least one probability that each at least one prediction CNN classification result and each at least one prediction CNN regression result included in the prediction object detection result are the same as each at least one GT CNN classification result and each at least one GT CNN regression result included in a GT object detection result; and (a4) the learning apparatus causing a loss layer to generate at least one RPN loss and at least one CNN loss with reference to the RPN confidence map, the CNN confidence map, the prediction object detection result and the GT object detection result, and to learn at least some of parameters included in the CNN and the RPN by performing backpropagation using the RPN loss and the CNN loss.
| 8. The method of claim 7, wherein in step (a4), the learning device causes the loss layer to generate the RPN loss according to the following equation,is a constant corresponding to the size of the convolutional feature map generated by performing an anchor operation,is a constant corresponding to the training image,is an i-th RPN confidence score corresponding to the i-th pixel of the convolutional feature map among the RPN confidence scores,means the i-th prediction RPN classification result corresponding to the i-th pixel,denotes an i-th GT RPN classification result corresponding to the i-th pixel,is the i-th prediction RPN regression result corresponding to the i-th pixel,denotes an i-th GT RPN regression result corresponding to the i-th pixel, and the i-th GT RPN classification result and the i-th GT RPN regression result correspond to the GT object detection result.
| 9. The method of claim 7, wherein in step (a4), the learning device causes the loss layer to generate the CNN loss according to the following equation, andis the number of the predicted ROI,is the i-th CNN confidence score corresponding to the i-th prediction ROI selected from the prediction ROI among the CNN confidence scores,means the i-th prediction CNN classification result corresponding to the i-th prediction ROI,is the i-th GT CNN classification result corresponding to the i-th pixel,Means the i-th prediction CNN regression result corresponding to the i-th prediction ROI,means an i-th GT CNN regression result corresponding to the i-th pixel, and the i-th prediction CNN classification result and the i-th prediction CNN regression result correspond to the prediction object detection result.
| 10. The method according to claim 7, wherein, after step (a3), the learning device causes the confidence layer to generate a unified confidence map including information on each integrated confidence score for each pixel in the training image, with reference to the RPN confidence map and the CNN confidence map.
| 11. The method of claim 10, wherein the learning device causes the confidence layer to perform (i-1) a process of obtaining, from the CNN, an NMS (Non-Maximum Suppression) result for the prediction ROIs, generated while the process of generating the prediction object detection result is performed, (i-2) a process of generating a resized RPN confidence map by applying a Resize operation to the RPN confidence map at least once, and (ii) a process of generating the unified confidence map with reference to the NMS result and the resized RPN confidence map.
| 12. The method according to claim 11, wherein the learning device causes the confidence layer to coordinate coordinates on the training image among the integrated confidence scores.To generate an X_Y th integrated confidence score corresponding to the following formula,means the X_Y th integrated confidence score,is the coordinates on the resized RPN confidence mapMeans the X_Y-th resized RPN confidence score corresponding to,is determined from the NMS result, the coordinatesincludesMethod characterized in that it means the i-th CNN confidence score for the i-th predicted ROI expressed as .
| 13. A method for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: (a) acquiring, by the computing device, at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle;
(b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the context image at least once to generate initial object information and initial confidence information for the context image; and (c) the computing device generating final object information for the context image by referring to the initial object information and the initial confidence information, through the support of the reinforcement learning agent and V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold; wherein step (b) includes: (b1) when the context image is obtained, the computing device causing at least one convolutional layer included in the CNN to generate at least one convolutional feature map by applying a convolution operation to the context image at least once;
(b2) causing the computing device to cause at least one anchor layer included in the RPN to perform a process for generating a predictive ROI on the context image by applying an anchor operation to the convolutional feature map at least once, generating an RPN confidence map comprising the RPN confidence score by generating, for each pixel of the convolutional feature map, each of at least one RPN confidence score, each of which represents at least one probability that a predicted ROI will be the same as the GT ROI;
(b3) when at least one ROI pooled feature map generated by using the convolutional feature map and the predicted ROI is obtained through the ROI pooling layer included in the CNN, the computing device, while performing a process of generating a prediction object detection result by using the ROI pooled feature map, causing the FC layer included in the CNN to generate a CNN confidence map including the CNN confidence score by generating, for each of the prediction ROIs, each CNN confidence score indicating at least one probability that each of the at least one prediction CNN classification result and the at least one prediction CNN regression result included in the prediction object detection result is the same as each of the at least one GT CNN classification result and the at least one GT CNN regression result included in the GT object detection result; and (b4) causing the computing device to generate an integrated confidence map with reference to the RPN confidence map and the CNN confidence map by the confidence layer operating in conjunction with the CNN;
wherein the computing device causes the CNN to output the initial object information including the prediction object detection result and the initial confidence information including the integrated confidence map.
| 14. A computing device for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: at least one memory for storing instructions; and at least one processor configured to execute the instructions for performing (I) a process of obtaining at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle, (II) a process of causing a Convolutional Neural Network (CNN) to generate initial object information and initial confidence information for the circumstance image by applying a CNN operation to the circumstance image at least once, and (III) a process of generating final object information for the circumstance image with reference to the initial object information and the initial confidence information, through V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold and support of a reinforcement learning agent; wherein, in the process (III), the processor causes the reinforcement learning agent to (i) select, from among the surrounding objects, one or more specific surrounding objects corresponding to one or more specific target areas on the circumstance image to which a complementary operation is to be applied, with reference to the initial confidence information, the basic meta information of the surrounding objects and the sensor information of the image sensor, (ii) obtain supplementary information through the V2X communication with the specific surrounding objects, and (iii) generate the final object information by adjusting the initial object information using the supplementary information.
| 15. delete
| 16. The computing device of claim 14, wherein, when the initial confidence information, the basic meta information, and the sensor information are input to the reinforcement learning agent, the reinforcement learning agent (i) selects the one or more specific surrounding objects by using its parameters, (ii) generates at least one reward with reference to the supplementary information, and (iii) learns at least a portion of the parameters with reference to the reward.
| 17. The computing device of claim 14, wherein the processor causes the reinforcement learning agent to determine whether to select the one or more specific surrounding objects by using at least some of (i) relative position information and scheduled path information of the surrounding objects, included in the basic meta information, (ii) Field-Of-View (FOV) information, internal parameter information, external parameter information and distortion information, included in the sensor information, and (iii) the initial confidence information.
| 18. The computing device of claim 14, wherein the processor generates the final object information by adjusting the initial object information with reference to supplementary information, which includes at least a portion of reference object information and reference confidence information generated by the specific surrounding object, and specific metadata of the specific surrounding object, and wherein the specific surrounding object generates the reference object information and the reference confidence information by performing object detection on its own surrounding objects.
| 19. The computing device of claim 14, wherein the processor further performs a process of (IV) transmitting the final object information to an autonomous driving module, thereby causing the autonomous driving module to perform autonomous driving of the target vehicle using the final object information.
| 20. A computing device for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: at least one memory for storing instructions; and at least one processor configured to execute the instructions for performing (I) a process of obtaining at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle, (II) a process of causing a Convolutional Neural Network (CNN) to generate initial object information and initial confidence information for the circumstance image by applying a CNN operation to the circumstance image at least once, and (III) a process of generating final object information for the circumstance image with reference to the initial object information and the initial confidence information, through V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold and support of a reinforcement learning agent; wherein, before the (I) process, the CNN has been learned by a learning device performing: (I1) when a training image is obtained, a process of causing at least one convolutional layer included in the CNN to generate at least one convolutional feature map by applying a convolution operation to the training image at least once; (I2) while performing a process of generating a predicted ROI (Region Of Interest) on the training image by applying an anchor operation to the convolutional feature map at least once, a process of causing at least one anchor layer included in an RPN (Region Proposal Network) to generate an RPN confidence map including RPN confidence scores by generating, for each pixel of the convolutional feature map, each at least one RPN confidence score representing each at least one probability that the predicted ROI is the same as a ground truth (GT) ROI; (I3) when at least one ROI pooled feature map generated by using the convolutional feature map and the predicted ROI is obtained through the ROI pooling layer included in the CNN, while performing a process of generating a prediction object detection result by using the ROI pooled feature map, a process of causing an FC layer included in the CNN to generate a CNN confidence map including CNN confidence scores by generating, for each predicted ROI, each CNN confidence score indicating each at least one probability that each at least one prediction CNN classification result and each at least one prediction CNN regression result included in the prediction object detection result are the same as each at least one GT CNN classification result and each at least one GT CNN regression result included in a GT object detection result; and (I4) a process of causing a loss layer to generate at least one RPN loss and at least one CNN loss with reference to the RPN confidence map, the CNN confidence map, the prediction object detection result and the GT object detection result, and to learn at least a part of the parameters included in the CNN and the RPN by performing backpropagation using the RPN loss and the CNN loss.
| 21. The method of claim 20, wherein in the process (I4), the learning device causes the loss layer to generate the RPN loss according to the following equation,is a constant corresponding to the size of the convolutional feature map generated by performing an anchor operation,is a constant corresponding to the training image,is an i-th RPN confidence score corresponding to the i-th pixel of the convolutional feature map among the RPN confidence scores,means the i-th prediction RPN classification result corresponding to the i-th pixel,denotes an i-th GT RPN classification result corresponding to the i-th pixel,is the i-th prediction RPN regression result corresponding to the i-th pixel,denotes an i-th GT RPN regression result corresponding to the i-th pixel, and the i-th GT RPN classification result and the i-th GT RPN regression result correspond to the GT object detection result.
| 22. The method of claim 20, wherein in the process (I4), the learning apparatus causes the loss layer to generate the CNN loss according to the following equation, andis the number of the predicted ROI,is the i-th CNN confidence score corresponding to the i-th prediction ROI selected from the prediction ROI among the CNN confidence scores,means the i-th prediction CNN classification result corresponding to the i-th prediction ROI,is the i-th GT CNN classification result corresponding to the i-th pixel,Means the i-th prediction CNN regression result corresponding to the i-th prediction ROI,denotes an i-th GT CNN regression result corresponding to the i-th pixel, and the i-th prediction CNN classification result and the i-th prediction CNN regression result correspond to the prediction object detection result.
| 23. The computing device of claim 20, wherein, after the (I3) process, the learning device causes the confidence layer to generate a unified confidence map including information on each integrated confidence score for each pixel in the training image, with reference to the RPN confidence map and the CNN confidence map.
| 24. The computing device of claim 23, wherein the learning device causes the confidence layer to perform (i-1) a process of obtaining, from the CNN, an NMS (Non-Maximum Suppression) result for the prediction ROIs, generated while the process of generating the prediction object detection result is performed, (i-2) a process of generating a resized RPN confidence map by applying a Resize operation to the RPN confidence map at least once, and (ii) a process of generating the unified confidence map with reference to the NMS result and the resized RPN confidence map.
| 25. The method according to claim 24, wherein the learning device causes the confidence layer to coordinate coordinates on the training image among the integrated confidence scores.To generate an X_Y th integrated confidence score corresponding to the following formula,means the X_Y th integrated confidence score,is the coordinates on the resized RPN confidence mapMeans the X_Y-th resized RPN confidence score corresponding to,is determined from the NMS result, the coordinatesincludesApparatus characterized in that it means the i-th CNN confidence score for the i-th predicted ROI expressed as .
| 26. A computing device for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: at least one memory for storing instructions; and at least one processor configured to execute the instructions for performing (I) a process of acquiring at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle, (II) a process of causing a Convolutional Neural Network (CNN) to generate initial object information and initial confidence information for the circumstance image by applying a CNN operation to the circumstance image at least once, and (III) a process of generating final object information for the circumstance image with reference to the initial object information and the initial confidence information, through V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold and support of a reinforcement learning agent; wherein the (II) process includes: (II1) when the circumstance image is obtained, a process of causing at least one convolutional layer included in the CNN to generate at least one convolutional feature map by applying a convolution operation to the circumstance image at least once; (II2) while causing at least one anchor layer included in the RPN to perform a process of generating a prediction ROI on the circumstance image by applying an anchor operation to the convolutional feature map at least once, a process of generating an RPN confidence map including the RPN confidence score by generating, for each pixel of the convolutional feature map, at least one RPN confidence score, each of which represents at least one probability that the prediction ROI is the same as a GT ROI; (II3) when at least one ROI pooled feature map generated by using the convolutional feature map and the prediction ROI is obtained through the ROI pooling layer included in the CNN, while performing a process of generating a prediction object detection result by using the ROI pooled feature map, a process of causing the FC layer included in the CNN to generate a CNN confidence map including the CNN confidence score by generating, for each prediction ROI, each CNN confidence score indicating at least one probability that each at least one prediction CNN classification result and each at least one prediction CNN regression result included in the prediction object detection result are the same as each at least one GT CNN classification result and each at least one GT CNN regression result included in the GT object detection result; and (II4) a process of causing the confidence layer operating in conjunction with the CNN to generate an integrated confidence map with reference to the RPN confidence map and the CNN confidence map; wherein the processor causes the CNN to output the initial object information including the prediction object detection result and the initial confidence information including the integrated confidence map. | The method involves acquiring one circumstance image on surroundings of a subject vehicle by a computing device, through one image sensor installed on the subject vehicle.
The computing device instructs a Convolutional Neural Network (CNN) to apply one CNN operation to the circumstance image, to generate initial object information and initial confidence information on the circumstance image. The computing device generates final object information on the circumstance image by referring to the initial object information and the initial confidence information with a support of a Reinforcement Learning (RL) agent, and through V2X communications with a portion of surrounding objects whose distances from the subject vehicle are smaller than a threshold. An INDEPENDENT CLAIM is included for a computing device for achieving better performance in an autonomous driving while saving computing powers, by using confidence scores representing a credibility of an object detection. Method for achieving better performance in an autonomous driving while saving computing powers, by using confidence scores representing a credibility of an object detection by a computing device (claimed). Reduces a consumption of the computing powers required for the autonomous driving. The drawing shows a block representation of a configuration of a computing device performing a method for an attention-driven resource allocation by using Reinforcement Learning and V2X communication, to achieve a safety of an autonomous driving. 110 Communication portion, 115 Memory, 120 Processor, 140 Region proposal network, 150 Confidence layer |
Please summarize the input | LEARNING METHOD AND LEARNING DEVICE FOR INTEGRATING OBJECT DETECTION INFORMATION ACQUIRED THROUGH V2V COMMUNICATION FROM OTHER AUTONOMOUS VEHICLE WITH OBJECT DETECTION INFORMATION GENERATED BY PRESENT AUTONOMOUS VEHICLE, AND TESTING METHOD AND TESTING DEVICE USING THE SAMEA learning method for generating integrated object detection information by integrating first object detection information and second object detection information is provided. And the method includes steps of: (a) a learning device instructing a concatenating network to generate one or more pair feature vectors; (b) the learning device instructing a determining network to apply FC operations to the pair feature vectors, to thereby generate (i) determination vectors and (ii) box regression vectors; (c) the learning device instructing a loss unit to generate an integrated loss by referring to the determination vectors, the box regression vectors and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN.|1. A learning method for generating integrated object detection information on an integrated target space including a first target space and a second target space, by integrating first object detection information on the first target space generated by a first vehicle and second object detection information on the second target space generated by a second vehicle, comprising steps of: (a) a learning device (100), if the first object detection information on the first target space and the second object detection information on the second target space are acquired by processing a first original image on the first target space and a second original image on the second target space, instructing a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors including information on one or more pairs of first original ROIs included in the first target space and second original ROIs in the second target space; (b) the learning device (100) instructing a determining network (220) included in the DNN (200) to apply one or more FC operations to the pair feature vectors, to thereby generate (i) one or more determination vectors including information on probabilities of the first original ROIs and the second original ROIs included in each of the pairs being appropriate to be integrated and (ii) one or more box regression vectors including information on each of relative 3-dimensional locations of integrated ROIs, corresponding to at least part of the pairs, comparing to each of original 3- dimensional locations of each component of said at least part of the pairs, on the integrated target space; (c) the learning device (100) instructing a loss unit (230) to generate an integrated loss by referring to the determination vectors, the box regression vectors and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN (200).
| 2. The learning method as claimed in claim 1, wherein, at the step of (a), a specific pair feature vector, which is one of the pair feature vectors, includes (i) first class information of a first specific object included in the first target space, (ii) feature values of a first specific original ROI including the first specific object, (iii) 3-dimensional coordinate values of a first specific original bounding box corresponding to the first specific original ROI, (iv) 3- dimensional coordinate values of the first specific original ROI, (v) second class information of a second specific object included in the second target space, (vi) feature values of a second specific original ROI including the second specific object, and (vii) 3-dimensional coordinate values of a second specific original bounding box corresponding to the second specific original ROI, and (viii) 3-dimensional coordinate values of the second specific original ROI.
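To make claim 2's pair feature vector concrete, the sketch below concatenates the listed per-vehicle attributes into a single vector; the field names and dimensions are hypothetical.

```python
# Minimal sketch (hypothetical field names): assembling one pair feature vector by
# concatenating the per-vehicle detection attributes listed in claim 2.

def pair_feature_vector(first, second):
    """first/second: dicts with 'class_onehot', 'roi_features', 'box_xyz', 'roi_xyz'."""
    vec = []
    for det in (first, second):
        vec += det["class_onehot"] + det["roi_features"] + det["box_xyz"] + det["roi_xyz"]
    return vec

if __name__ == "__main__":
    det_a = {"class_onehot": [1, 0], "roi_features": [0.3, 0.7], "box_xyz": [1.0, 2.0, 0.5], "roi_xyz": [1.1, 2.1, 0.5]}
    det_b = {"class_onehot": [0, 1], "roi_features": [0.6, 0.1], "box_xyz": [9.0, 2.2, 0.5], "roi_xyz": [9.1, 2.3, 0.5]}
    print(len(pair_feature_vector(det_a, det_b)))   # -> 20
```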
| 3. The learning method as claimed in claim 2, wherein, at the step of (b), a specific determination vector, which is one of the determination vectors and corresponds to the specific pair feature vector, includes information on a probability of the first specific original ROI and the second specific original ROI being integrated on the integrated target space, and a specific box regression vector, which is one of the box regression vectors and corresponds to the specific pair feature vector, includes information on 3- dimensional coordinates of a specific integrated bounding box generated by merging the first specific original ROI and the second specific original ROI on the integrated target space.
| 4. The learning method as claimed in claim 1, wherein, at the step of (c), the learning device instructs the loss unit (i) to generate a determination loss by using at least part of the determination vectors through a cross entropy method, (ii) to generate a box regression loss by using at least part of the box regression vectors through a smooth-L1 method, and (iii) to generate the integrated loss by referring to the determination loss and the box regression loss.
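Claim 4 names the two loss terms explicitly: cross entropy over the determination vectors and smooth-L1 over the box regression vectors. The sketch below combines them; the equal weighting and the averaging/summing conventions are assumptions, since the exact formulas are given in claims 5 and 16.

```python
# Minimal sketch of the loss composition in claim 4: a cross-entropy term over the
# determination vectors plus a smooth-L1 term over the box regression vectors.
import math

def cross_entropy(pred_probs, gt_labels, eps=1e-12):
    """Binary cross-entropy averaged over the determination vectors."""
    return -sum(g * math.log(p + eps) + (1 - g) * math.log(1 - p + eps)
                for p, g in zip(pred_probs, gt_labels)) / len(pred_probs)

def smooth_l1(pred, gt):
    """Standard smooth-L1 (Huber with delta=1), summed over vector components."""
    total = 0.0
    for p, g in zip(pred, gt):
        d = abs(p - g)
        total += 0.5 * d * d if d < 1.0 else d - 0.5
    return total

def integrated_loss(det_probs, det_gt, box_pred, box_gt, box_weight=1.0):
    return cross_entropy(det_probs, det_gt) + box_weight * smooth_l1(box_pred, box_gt)

if __name__ == "__main__":
    print(round(integrated_loss([0.9, 0.2], [1, 0], [1.2, 0.4, 2.0], [1.0, 0.5, 2.3]), 4))
```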
| 5. The learning method as claimed in claim 4, wherein, at the step of (c), the determination loss is generated by a formula below: Equation wherein n denotes the number of the determination vectors, v^C_i denotes an i-th determination vector, v^(C-GT)_i denotes an i-th determination GT vector on the i-th determination vector, and the box regression loss is generated by a formula below: Equation wherein n denotes the number of the box regression vectors, v^C_i denotes an i-th box regression vector, v^(C-GT)_i denotes an i-th box regression GT vector on the i-th box regression vector.
| 6. The learning method as claimed in claim 1, wherein the learning device instructs each of deep learning neurons included in one or more layers of the DNN to repeatedly apply one or more convolutional operations to its input by using its own at least one parameter and deliver its output to its next deep learning neuron, to thereby generate the pair feature vectors, the determination vectors and the box regression vectors.
| 7. The learning method as claimed in claim 1, wherein, at the step of (b), the learning device instructs the determining network included in the DNN to generate the determination vectors by applying at least part of the FC operations to the pair feature vectors, and to generate the one or more box regression vectors corresponding to one or more specific pair feature vectors, among the pair feature vectors, whose values in corresponding specific determination vectors denoting specific probabilities of specific pairs to be integrated are larger than a prescribed threshold, by applying the other part of the FC operations to the specific pair feature vectors.
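Claim 7 gates the box regression: determination scores are produced for every pair, but regression is applied only to pairs whose scores exceed a threshold. The sketch below mimics that control flow with toy stand-ins for the FC operations; none of the names or score functions come from the patent.

```python
# Minimal sketch of the gating in claim 7: determination scores for every pair,
# box regression only for pairs scoring above a threshold.

def score_pair(vec):
    return min(1.0, max(0.0, sum(vec) / len(vec)))      # toy "probability of merging"

def regress_pair(vec):
    return [0.5 * v for v in vec[:3]]                    # toy merged-box coordinates

def determine_and_regress(pair_vectors, threshold=0.6):
    determinations, regressions = {}, {}
    for pid, vec in pair_vectors.items():
        determinations[pid] = score_pair(vec)
        if determinations[pid] > threshold:              # only confident pairs get a box
            regressions[pid] = regress_pair(vec)
    return determinations, regressions

if __name__ == "__main__":
    pairs = {"p0": [0.9, 0.8, 0.7, 0.9], "p1": [0.1, 0.2, 0.3, 0.1]}
    print(determine_and_regress(pairs))
```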
| 8. A testing method for generating integrated object detection information for testing on an integrated target space for testing including a first target space for testing and a second target space for testing, by integrating first object detection information for testing on the first target space for testing generated by a first vehicle for testing and second object detection information for testing on the second target space for testing generated by a second vehicle for testing, comprising steps of: (a) on condition that (1) a learning device (100), if first object detection information for training on a first target space for training and second object detection information for training on a second target space for training have been acquired by processing a first original image for training on the first target space for training and a second original image for training on the second target space for training, has instructed a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors for training including information on one or more pairs for training of first original ROIs for training included in the first target space for training and second original ROIs for training in the second target space for training; (2) the learning device (100) has instructed a determining network (220) included in the DNN to apply one or more FC operations to the pair feature vectors for training, to thereby generate (i) one or more determination vectors for training including information on probabilities for training of the first original ROIs for training and the second original ROIs for training included in each of the pairs for training being appropriate to be integrated and (ii) one or more box regression vectors for training including information on each of relative 3-Dimensional locations for training of integrated ROIs for training, corresponding to at least part of the pairs for training, comparing to each of original 3-Dimensional locations for training of each component of said at least part of the pairs for training, on an integrated target space for training; (3) the learning device (100) has instructed a loss unit (230) to generate an integrated loss by referring to the determination vectors for training, the box regression vectors for training and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN, a testing device installed on the first vehicle, if the first object detection information for testing on the first target space for testing and the second object detection information for testing on the second target space for testing are acquired by processing a first original image for testing on the first target space for testing and a second original image for testing on the second target space for testing, instructing the concatenating network (210) included in the DNN (200) to generate one or more pair feature vectors for testing including information on one or more pairs for testing of first original ROIs for testing included in the first target space for testing and second original ROIs for testing in the second target space for testing; (b) the testing device (100) instructing the determining network (220) included in the DNN to apply the FC operations to the pair feature vectors for testing, to thereby generate (i) one or more determination vectors for testing including information on probabilities for testing of the first original ROIs 
for testing and the second original ROIs for testing included in each of the pairs for testing being appropriate to be integrated and (ii) one or more box regression vectors for testing including information on each of relative 3-dimensional locations for testing of integrated ROIs for testing, corresponding to at least part of the pairs for testing, comparing to each of original 3-Dimensional locations for testing of each component of said at least part of the pairs for testing, on the integrated target space for testing; (c) the testing device (100) instructing a merging unit to generate the integrated object detection information for testing by merging at least part of the pairs for testing of first original bounding boxes for testing and second original bounding boxes for testing by referring to the determination vectors for testing and the box regression vectors for testing.
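Step (c) of claim 8 merges original bounding boxes pairwise according to the determination and box regression vectors. A minimal, hypothetical merging routine is sketched below: accepted pairs contribute their regressed box, and unmatched detections are carried over.

```python
# Minimal sketch of the merging step in claim 8: pairs whose determination score
# exceeds a threshold contribute their regressed (merged) box to the integrated
# result; detections not consumed by any accepted pair are kept unchanged.

def merge_detections(first_boxes, second_boxes, pairs, determinations, regressions, threshold=0.6):
    merged, used_first, used_second = [], set(), set()
    for pid, (i, j) in pairs.items():
        if determinations.get(pid, 0.0) > threshold and pid in regressions:
            merged.append(regressions[pid])              # merged box on the integrated space
            used_first.add(i)
            used_second.add(j)
    merged += [b for k, b in enumerate(first_boxes) if k not in used_first]
    merged += [b for k, b in enumerate(second_boxes) if k not in used_second]
    return merged

if __name__ == "__main__":
    first = [[0.0, 0.0, 2.0], [5.0, 1.0, 2.0]]
    second = [[0.2, 0.1, 2.0]]
    pairs = {"p0": (0, 0)}
    print(merge_detections(first, second, pairs, {"p0": 0.9}, {"p0": [0.1, 0.05, 2.0]}))
```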
| 9. The testing method as claimed in claim 8, wherein the first original image for testing on the first target space for testing, acquired through at least part of one or more first cameras, one or more first lidars and one or more first radars installed on the first vehicle, is processed by a first neural network included in the first vehicle, to thereby generate the first object detection information for testing including (i) first class information for testing on objects for testing included in the first target space for testing, (ii) feature values for testing of the first original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the first original bounding boxes for testing and (iv) 3-dimensional coordinate values for testing of the first original ROIs for testing, and the second original image for testing on the second target space for testing, acquired through at least part of one or more second cameras, one or more second lidars and one or more second radars installed on the second vehicle, is processed by a second neural network included in the second vehicle, to thereby generate the second object detection information including (i) second class information for testing on objects for testing included in the second target space for testing, (ii) feature values for testing of the second original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the second original bounding boxes for testing and (iv) 3- dimensional coordinate values for testing of the second original ROIs for testing, and the second object detection information is delivered to the first vehicle through a V2V communication.
| 10. The testing method as claimed in claim 8, wherein, at the step of (a), a specific pair feature vector for testing, which is one of the pair feature vectors for testing, includes (i) first class information for testing of a first specific object for testing included in the first target space for testing, (ii) feature values for testing of a first specific original ROI for testing including the first specific object for testing, (iii) 3-dimensional coordinate values of a first specific original bounding box for testing corresponding to the first specific original ROI for testing, (iv) 3-dimensional coordinate values of the first specific original ROI for testing, (v) second class information for testing of a second specific object for testing included in the second target space for testing, (vi) feature values for testing of a second specific original ROI for testing including the second specific object for testing, (vii) 3-dimensional coordinate values of a second specific original bounding box for testing corresponding to the second specific original ROI for testing, and (viii) 3- dimensional coordinate values of the second specific original ROI for testing.
| 11. The testing method as claimed in claim 10, wherein, at the step of (b), a specific determination vector for testing, which is one of the determination vectors for testing and corresponds to the specific pair feature vector for testing, includes information on a probability of the first specific original ROI for testing and the second specific original ROI for testing being integrated on the integrated target space for testing, and a specific box regression vector for testing, which is one of the box regression vectors for testing and corresponds to the specific pair feature vector for testing, includes information on 3-dimensional coordinates of a specific integrated bounding box for testing generated by merging the first specific original ROI for testing and the second specific original ROI for testing on the integrated target space for testing.
| 12. A learning device (100) for generating integrated object detection information on an integrated target space including a first target space and a second target space, by integrating first object detection information on the first target space generated by a first vehicle and second object detection information on the second target space generated by a second vehicle, comprising: at least one memory (115) that stores instructions; and at least one processor (120) configured to execute the instructions to perform processes of: (I) if the first object detection information on the first target space and the second object detection information on the second target space are acquired by processing a first original image on the first target space and a second original image on the second target space, instructing a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors including information on one or more pairs of first original ROIs included in the first target space and second original ROIs in the second target space; (II) instructing a determining network (220) included in the DNN to apply one or more FC operations to the pair feature vectors, to thereby generate (i) one or more determination vectors including information on probabilities of the first original ROIs and the second original ROIs included in each of the pairs being appropriate to be integrated and (ii) one or more box regression vectors including information on each of relative 3-dimensional locations of integrated ROIs, corresponding to at least part of the pairs, comparing to each of original 3- dimensional locations of each component of said at least part of the pairs, on the integrated target space; (III) instructing a loss unit (230) to generate an integrated loss by referring to the determination vectors, the box regression vectors and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN.
| 13. The learning device as claimed in claim 12, wherein, at the process of (I), a specific pair feature vector, which is one of the pair feature vectors, includes (i) first class information of a first specific object included in the first target space, (ii) feature values of a first specific original ROI including the first specific object, (iii) 3-dimensional coordinate values of a first specific original bounding box corresponding to the first specific original ROI, (iv) 3- dimensional coordinate values of the first specific original ROI, (v) second class information of a second specific object included in the second target space, (vi) feature values of a second specific original ROI including the second specific object, and (vii) 3-dimensional coordinate values of a second specific original bounding box corresponding to the second specific original ROI, and (viii) 3-dimensional coordinate values of the second specific original ROI.
| 14. The learning device as claimed in claim 13, wherein, at the process of (II), a specific determination vector, which is one of the determination vectors and corresponds to the specific pair feature vector, includes information on a probability of the first specific original ROI and the second specific original ROI being integrated on the integrated target space, and a specific box regression vector, which is one of the box regression vectors and corresponds to the specific pair feature vector, includes information on 3- dimensional coordinates of a specific integrated bounding box generated by merging the first specific original ROI and the second specific original ROI on the integrated target space.
| 15. The learning device as claimed in claim 12, wherein, at the process of (III), the processor instructs the loss unit (i) to generate a determination loss by using at least part of the determination vectors through a cross entropy device, (ii) to generate a box regression loss by using at least part of the box regression vectors through a smooth-L1 device, and (iii) to generate the integrated loss by referring to the determination loss and the box regression loss.
| 16. The learning device as claimed in claim 15, wherein, at the process of (III), the determination loss is generated by a formula below: Equation wherein n denotes the number of the determination vectors, v^C_i denotes an i-th determination vector, v^(C-GT)_i denotes an i-th determination GT vector on the i-th determination vector, and the box regression loss is generated by a formula below: Equation wherein n denotes the number of the box regression vectors, v^C_i denotes an i-th box regression vector, v^(C-GT)_i denotes an i-th box regression GT vector on the i-th box regression vector.
| 17. The learning device as claimed in claim 12, wherein the processor instructs each of deep learning neurons included in one or more layers of the DNN to repeatedly apply one or more convolutional operations to its input by using its own at least one parameter and deliver its output to its next deep learning neuron, to thereby generate the pair feature vectors, the determination vectors and the box regression vectors.
| 18. The learning device as claimed in claim 12, wherein, at the process of (II), the processor instructs the determining network included in the DNN to generate the determination vectors by applying at least part of the FC operations to the pair feature vectors, and to generate the one or more box regression vectors corresponding to one or more specific pair feature vectors, among the pair feature vectors, whose values in corresponding specific determination vectors denoting specific probabilities of specific pairs to be integrated are larger than a prescribed threshold, by applying the other part of the FC operations to the specific pair feature vectors.
| 19. A testing device (100) for generating integrated object detection information for testing on an integrated target space for testing including a first target space for testing and a second target space for testing, by integrating first object detection information for testing on the first target space for testing generated by a first vehicle for testing and second object detection information for testing on the second target space for testing generated by a second vehicle for testing, comprising: at least one memory (115) that stores instructions; and at least one processor (120) configured to execute the instructions to perform processes of: (I) on condition that (1) a learning device (100), if first object detection information for training on a first target space for training and second object detection information for training on a second target space for training have been acquired by processing a first original image for training on the first target space for training and a second original image for training on the second target space for training, has instructed a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors for training including information on one or more pairs for training of first original ROIs for training included in the first target space for training and second original ROIs for training in the second target space for training; (2) the learning device (100) has instructed a determining network (220) included in the DNN to apply one or more FC operations to the pair feature vectors for training, to thereby generate (i) one or more determination vectors for training including information on probabilities for training of the first original ROIs for training and the second original ROIs for training included in each of the pairs for training being appropriate to be integrated and (ii) one or more box regression vectors for training including information on each of relative 3-Dimensional locations for training of integrated ROIs for training, corresponding to at least part of the pairs for training, comparing to each of original 3-Dimensional locations for training of each component of said at least part of the pairs for training, on an integrated target space for training; (3) the learning device (100) has instructed a loss unit (230) to generate an integrated loss by referring to the determination vectors for training, the box regression vectors for training and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN, if the first object detection information for testing on the first target space for testing and the second object detection information for testing on the second target space for testing are acquired by processing a first original image for testing on the first target space for testing and a second original image for testing on the second target space for testing, instructing the concatenating network (210) included in the DNN (200) to generate one or more pair feature vectors for testing including information on one or more pairs for testing of first original ROIs for testing included in the first target space for testing and second original ROIs for testing in the second target space for testing; (II) instructing the determining network (220) included in the DNN to apply the FC operations to the pair feature vectors for testing, to thereby generate (i) one or more determination vectors for testing 
including information on probabilities for testing of the first original ROIs for testing and the second original ROIs for testing included in each of the pairs for testing being appropriate to be integrated and (ii) one or more box regression vectors for testing including information on each of relative 3-dimensional locations for testing of integrated ROIs for testing, corresponding to at least part of the pairs for testing, comparing to each of original 3-Dimensional locations for testing of each component of said at least part of the pairs for testing, on the integrated target space for testing; (III) instructing a merging unit to generate the integrated object detection information for testing by merging at least part of the pairs for testing of first original bounding boxes for testing and second original bounding boxes for testing by referring to the determination vectors for testing and the box regression vectors for testing.
| 20. The testing device as claimed in claim 19, wherein the first original image for testing on the first target space for testing, acquired through at least part of one or more first cameras, one or more first lidars and one or more first radars installed on the first vehicle, is processed by a first neural network included in the first vehicle, to thereby generate the first object detection information for testing including (i) first class information on objects for testing included in the first target space for testing, (ii) feature values for testing of the first original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the first original bounding boxes for testing and (iv) 3- dimensional coordinate values for testing of the first original ROIs for testing, and the second original image for testing on the second target space for testing, acquired through at least part of one or more second cameras, one or more second lidars and one or more second radars installed on the second vehicle, is processed by a second neural network included in the second vehicle, to thereby generate the second object detection information including (i) second class information on objects for testing included in the second target space for testing, (ii) feature values for testing of the second original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the second original bounding boxes for testing and (iv) 3-dimensional coordinate values for testing of the second original ROIs for testing, and the second object detection information is delivered to the first vehicle through a V2V communication.
| 21. The testing device as claimed in claim 19, wherein, at the process of (I), a specific pair feature vector for testing, which is one of the pair feature vectors for testing, includes (i) first class information for testing of a first specific object for testing included in the first target space for testing, (ii) feature values for testing of a first specific original ROI for testing including the first specific object for testing, (iii) 3-dimensional coordinate values of a first specific original bounding box for testing corresponding to the first specific original ROI for testing, (iv) 3-dimensional coordinate values of the first specific original ROI for testing, (v) second class information for testing of a second specific object for testing included in the second target space for testing, (vi) feature values for testing of a second specific original ROI for testing including the second specific object for testing, (vii) 3-dimensional coordinate values of a second specific original bounding box for testing corresponding to the second specific original ROI for testing, and (viii) 3- dimensional coordinate values of the second specific original ROI for testing.
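For illustration only, the following Python sketch shows one plausible way to assemble the pair feature vector described in claim 21 by concatenating its eight listed components. The dictionary keys, array sizes, and helper name are assumptions for the sketch, not part of the claims.

```python
import numpy as np

def build_pair_feature_vector(first_det, second_det):
    """Concatenate the eight components listed in claim 21 into one pair feature vector.

    `first_det` / `second_det` are assumed dicts with keys:
      'class'    : class information of the detected object (e.g. one-hot vector)
      'roi_feat' : feature values of the original ROI
      'box_3d'   : 3-dimensional coordinate values of the original bounding box
      'roi_3d'   : 3-dimensional coordinate values of the original ROI
    """
    parts = [
        first_det['class'], first_det['roi_feat'],
        first_det['box_3d'], first_det['roi_3d'],
        second_det['class'], second_det['roi_feat'],
        second_det['box_3d'], second_det['roi_3d'],
    ]
    return np.concatenate([np.asarray(p, dtype=np.float32).ravel() for p in parts])

# toy example with made-up dimensions
det_a = {'class': [1, 0, 0], 'roi_feat': np.random.rand(128),
         'box_3d': np.random.rand(6), 'roi_3d': np.random.rand(6)}
det_b = {'class': [0, 1, 0], 'roi_feat': np.random.rand(128),
         'box_3d': np.random.rand(6), 'roi_3d': np.random.rand(6)}
print(build_pair_feature_vector(det_a, det_b).shape)
```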
| 22. The testing device as claimed in claim 21, wherein, at the process of (II), a specific determination vector for testing, which is one of the determination vectors for testing and corresponds to the specific pair feature vector for testing, includes information on a probability of the first specific original ROI for testing and the second specific original ROI for testing being integrated on the integrated target space for testing, and a specific box regression vector for testing, which is one of the box regression vectors for testing and corresponds to the specific pair feature vector for testing, includes information on 3-dimensional coordinates of a specific integrated bounding box for testing generated by merging the first specific original ROI for testing and the second specific original ROI for testing on the integrated target space for testing. | The learning method involves instructing a concatenating network (210) included in a DNN (200) to generate the pair feature vectors including information on the pairs of first original ROIs included in the first target space and second original ROIs in the second target space. The learning device instructs a determining network (220) included in the DNN to apply FC operations to the pair feature vectors. The learning device instructs a loss unit (230) to generate an integrated loss by referring to the determination vectors, the box regression vectors and their corresponding GTs, and performs backpropagation by using the integrated loss to learn at least part of the parameters included in the DNN. INDEPENDENT CLAIMS are included for the following: a testing method for generating integrated object detection information for testing on an integrated target space; and a learning device for generating integrated object detection information on an integrated target space. Learning method for generating integrated object detection information on an integrated target space using a learning device (claimed). The original images are integrated and the results of object detection on the integrated target space are generated without additional operations on the integrated image. The safety of the autonomous vehicles using the integrated space detection result is improved. The drawing shows a schematic view of a learning device performing a learning process for integrating object detection information. 210 Concatenating network, 220 Determining network, 230 Loss unit
Please summarize the input | METHOD FOR PROVIDING ROBUST OBJECT DISTANCE ESTIMATION BASED ON CAMERA BY PERFORMING PITCH CALIBRATION OF CAMERA MORE PRECISELY WITH FUSION OF INFORMATION ACQUIRED THROUGH CAMERA AND INFORMATION ACQUIRED THROUGH V2V COMMUNICATION AND DEVICE USING THE SAMEA method for enhancing an accuracy of object distance estimation based on a subject camera by performing pitch calibration of the subject camera more precisely with additional information acquired through V2V communication is provided. And the method includes steps of: (a) a computing device, performing (i) a process of instructing an initial pitch calibration module to apply a pitch calculation operation to the reference image, to thereby generate an initial estimated pitch, and (ii) a process of instructing an object detection network to apply a neural network operation to the reference image, to thereby generate reference object detection information; (b) the computing device instructing an adjusting pitch calibration module to (i) select a target object, (ii) calculate an estimated target height of the target object, (iii) calculate an error corresponding to the initial estimated pitch, and (iv) determine an adjusted estimated pitch on the subject camera by using the error.|1. A method for enhancing an accuracy of object distance estimation based on at least one subject camera by performing pitch calibration of the subject camera more precisely with additional information acquired through Vehicle-to-Vehicle (V2V) communication, comprising steps of: (a) a computing device [100], if at least one reference image is acquired through the subject camera, performing (i) a process of instructing an initial pitch calibration module [140] to apply at least one pitch calculation operation to the reference image, to thereby generate an initial estimated pitch, which is a value generated by estimating an angle between an optical axis of the subject camera and a ground, and (ii) a process of instructing an object detection network [170] to apply at least one neural network operation to the reference image, to thereby generate reference object detection information on one or more reference objects in the reference image; (b) the computing device [100] instructing an adjusting pitch calibration module [150] to (i) select at least one target object among the reference objects, (ii) calculate at least one estimated target height of the target object by referring to the initial estimated pitch and at least one relative location of the target object from a subject autonomous vehicle including the subject camera, (iii) calculate at least one error corresponding to the initial estimated pitch by referring to the estimated target height and at least one Ground-Truth (GT) target height acquired through V2V communication, and (iv) determine at least one adjusted estimated pitch on the subject camera by using the error; (c) the computing device [100] instructing the object detection network [170] and a distance calculation module [160] to generate autonomous driving information including information on distances, calculated by referring to the adjusted estimated pitch, between the subject autonomous vehicle and surrounding objects included in an autonomous driving image, wherein the computing device [100] instructs the adjusting pitch calibration module [150] to select one or more specific reference objects, among the reference objects, which satisfy a first condition on whether each of the reference objects has a specific class 
corresponding to a communicability or not, and to select said at least one target object, among the specific reference objects, which satisfies at least one of (i) a second condition on whether each of specific reference bounding boxes including each of the specific reference objects is located in at least one illustration window area of the reference image or not and (ii) a third condition on whether an aspect ratio of each of the specific reference bounding box is smaller than an estimation threshold value or not.
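As a minimal sketch of the target-object selection described in claim 1 above, the following Python filter keeps only objects of a communicable class (first condition) that additionally satisfy the illustration-window condition or the aspect-ratio condition. The box format, the class label, the way the window is obtained, and the width/height reading of "aspect ratio" are assumptions, not taken from the claim.

```python
def select_target_objects(detections, window, aspect_ratio_threshold=0.5):
    """Filter reference objects following the three conditions of claim 1.

    `detections`: list of dicts with 'class' and 'box' = (x1, y1, x2, y2) in pixels.
    `window`: (x1, y1, x2, y2) of the illustration window area (assumed format;
    the claim does not define how this area is obtained).
    """
    def communicable(det):                       # first condition: V2V/V2X-capable class
        return det['class'] in ('vehicle_v2x',)  # assumed label name

    def inside_window(box, win):                 # second condition
        return (box[0] >= win[0] and box[1] >= win[1] and
                box[2] <= win[2] and box[3] <= win[3])

    def aspect_ratio(box):                       # third condition: width / height (assumed)
        w, h = box[2] - box[0], box[3] - box[1]
        return w / max(h, 1e-6)

    specific = [d for d in detections if communicable(d)]
    return [d for d in specific
            if inside_window(d['box'], window)
            or aspect_ratio(d['box']) < aspect_ratio_threshold]
```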
| 2. The method of Claim 1, wherein, before the step of (b), the computing device [100] instructs the distance calculation module [160] to map each of one or more reference location base points, which are points in each of lower sides of each of reference bounding boxes including each of the reference objects, onto a space coordinate system corresponding to a virtual space including the subject autonomous vehicle by referring to the initial estimated pitch, the reference object detection information and the reference image, to thereby calculate one or more longitudinal floor distances and one or more lateral floor distances between the reference objects and the subject autonomous vehicle, and then to thereby generate each of reference relative coordinates including each of the longitudinal floor distances and the lateral floor distances as its components.
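The patent does not disclose the exact projection formulas behind claim 2; the sketch below uses a common flat-ground pinhole approximation purely to illustrate how a base point at the lower side of a bounding box could be mapped to longitudinal and lateral floor distances with the estimated pitch. Every formula and parameter here is an assumption.

```python
import math

def base_point_to_floor_distances(u, v, pitch, cam_height, focal, cx, cy):
    """Map a reference location base point (u, v) on the image onto the ground plane.

    Flat-ground pinhole approximation (illustrative only): `pitch` is the estimated
    angle (rad) between the optical axis and the ground, `cam_height` the camera
    height above the ground, (cx, cy) the principal point, `focal` in pixels.
    """
    ray_angle = math.atan2(v - cy, focal)          # angle of the ray below the optical axis
    longitudinal = cam_height / math.tan(pitch + ray_angle)
    lateral = longitudinal * (u - cx) / focal      # small-pitch approximation
    return longitudinal, lateral

# toy usage: base point near the bottom-centre of a 1280x720 image
print(base_point_to_floor_distances(u=700, v=600, pitch=math.radians(2.0),
                                    cam_height=1.5, focal=1000.0, cx=640, cy=360))
```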
| 3. The method of Claim 2, wherein, before the step of (b), the computing device [100] instructs the distance calculation module [160] to map the reference location base points onto the space coordinate system.
| 4. The method of Claim 2, wherein, before the step of (b), the computing device [100] instructs a V2V communication module [130] to communicate with one or more communicable objects located closer than a threshold distance from the subject autonomous vehicle, to thereby acquire information on one or more communicable object classes, one or more communicable object GT heights, and one or more communicable object coordinates, and instructs the adjusting pitch calibration module [150] to select specific reference objects, among the reference objects, which have a specific class corresponding to a communicability, and pair at least part of the communicable object GT heights with at least part of the specific reference objects by referring to (i) communicable object relative coordinates in relation to the subject autonomous vehicle, calculated by using the communicable object coordinates and (ii) specific reference relative coordinates on the specific reference objects, to thereby acquire specific reference GT heights on the specific reference objects.
| 5. The method of Claim 1, wherein, at the step of (b), the computing device [100], if the target object is selected, instructs the adjusting pitch calibration module [150] to select a target relative coordinate corresponding to the target object, among the reference relative coordinates, and to calculate the estimated target height by performing a height estimating operation by referring to the initial estimated pitch.
| 6. The method of Claim 1, wherein, at the step of (b), the computing device [100], in case the number of the target object is 1, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, and (ii-1) to adjust the initial estimated pitch to be decreased by a prescribed adjustment ratio if the estimated target height is included in the overestimated range, or (ii-2) to adjust the initial estimated pitch to be increased by the prescribed adjustment ratio if the estimated target height is included in the underestimated range.
| 7. The method of Claim 1, wherein, at the step of (b), the computing device [100], in case the number of the target object is larger than or same as 2, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, (ii) to acquire information on at least one of an overestimated error ratio corresponding to the overestimated range and an underestimated error ratio corresponding to the underestimated range, and (iii) adjust the initial estimated pitch by referring to said information.
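The following sketch illustrates one plausible reading of the pitch-adjustment rules in claims 6 and 7: a single target is handled by in/decreasing the pitch by a prescribed ratio depending on whether its estimated height falls in the overestimated or underestimated range, and multiple targets are handled by weighing how many fall in each range. The margin-based ranges and the exact blending of the error ratios are assumptions; the claims only fix the general behaviour.

```python
def adjust_pitch_single_target(initial_pitch, estimated_height, gt_height,
                               adjustment_ratio=0.05, margin=0.1):
    """Claim 6 style: single-target adjustment by a prescribed ratio."""
    if estimated_height > gt_height * (1.0 + margin):      # overestimated range (assumed bounds)
        return initial_pitch * (1.0 - adjustment_ratio)    # decrease by the prescribed ratio
    if estimated_height < gt_height * (1.0 - margin):      # underestimated range (assumed bounds)
        return initial_pitch * (1.0 + adjustment_ratio)    # increase by the prescribed ratio
    return initial_pitch

def adjust_pitch_multi_target(initial_pitch, estimated_heights, gt_heights,
                              adjustment_ratio=0.05, margin=0.1):
    """Claim 7 style: with two or more targets, adjust by referring to the
    overestimated and underestimated error ratios (one plausible reading)."""
    over = sum(e > g * (1.0 + margin) for e, g in zip(estimated_heights, gt_heights))
    under = sum(e < g * (1.0 - margin) for e, g in zip(estimated_heights, gt_heights))
    total = len(estimated_heights)
    return initial_pitch * (1.0
                            - adjustment_ratio * over / total
                            + adjustment_ratio * under / total)
```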
| 8. The method of Claim 1, wherein, at the step of (a), the computing device [100] instructs the initial pitch calibration module [140] to generate the initial estimated pitch by applying the pitch calculation operation to the reference image.
| 9. The method of Claim 1, wherein, at the step of (a), the computing device [100] (i) instructs a convolutional layer of the object detection network [170] to generate at least one reference convolutional feature map by applying at least one convolutional operation, which is a part of the neural network operation, to the reference image, (ii) instructs an ROI pooling layer of the object detection network [170] to apply at least one pooling operation, which is a part of the neural network operation, in order to pool values, corresponding to ROIs of the reference image, from the reference convolutional feature map, to thereby generate at least one reference ROI-Pooled feature map, and (iii) instructs an FC layer of the object detection network [170] to apply at least one FC operation, which is a part of the neural network operation, to the reference ROI-Pooled feature map, to thereby generate the reference object detection information including information on reference classes of the reference objects and reference bounding boxes including the reference objects.
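Claim 9 fixes only the ordering convolution → ROI pooling → FC operations. The PyTorch sketch below mirrors that ordering with toy layer sizes; the channel counts, the way ROIs are cropped on the feature map, and the two output heads are illustrative assumptions rather than the patented detector.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyDetectionHead(nn.Module):
    """Minimal sketch of the claim-9 ordering: convolution -> ROI pooling -> FC."""
    def __init__(self, num_classes=3, pooled=7):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.pooled = pooled
        self.fc = nn.Sequential(nn.Linear(32 * pooled * pooled, 128), nn.ReLU())
        self.cls_head = nn.Linear(128, num_classes)   # reference classes
        self.box_head = nn.Linear(128, 4)             # reference bounding boxes

    def forward(self, image, rois):
        feat = self.backbone(image)                   # (i) reference convolutional feature map
        outs = []
        for x1, y1, x2, y2 in rois:                   # (ii) pool values for each ROI
            crop = feat[:, :, y1:y2, x1:x2]
            outs.append(F.adaptive_max_pool2d(crop, self.pooled).flatten(1))
        pooled = torch.cat(outs, dim=0)               # reference ROI-Pooled feature map
        h = self.fc(pooled)                           # (iii) FC operations
        return self.cls_head(h), self.box_head(h)

image = torch.rand(1, 3, 128, 256)
rois = [(4, 4, 20, 20), (10, 6, 30, 26)]              # ROI boxes in feature-map coordinates (assumed)
cls_scores, boxes = TinyDetectionHead()(image, rois)
print(cls_scores.shape, boxes.shape)
```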
| 10. A computing device [100] for enhancing an accuracy of object distance estimation based on at least one subject camera by performing pitch calibration of the subject camera more precisely with additional information acquired through Vehicle-to-Vehicle (V2V) communication, comprising: at least one memory [115] that stores instructions; and at least one processor [110] configured to execute the instructions to perform processes of: (I) if at least one reference image is acquired through the subject camera, performing (i) a process of instructing an initial pitch calibration module [140] to apply at least one pitch calculation operation to the reference image, to thereby generate an initial estimated pitch, which is a value generated by estimating an angle between an optical axis of the subject camera and a ground, and (ii) a process of instructing an object detection network [170] to apply at least one neural network operation to the reference image, to thereby generate reference object detection information on one or more reference objects in the reference image; (II) instructing an adjusting pitch calibration module [150] to (i) select at least one target object among the reference objects, (ii) calculate at least one estimated target height of the target object by referring to the initial estimated pitch and at least one relative location of the target object from a subject autonomous vehicle including the subject camera, (iii) calculate at least one error corresponding to the initial estimated pitch by referring to the estimated target height and at least one Ground-Truth (GT) target height acquired beforehand, and (iv) determine at least one adjusted estimated pitch on the subject camera by using the error; (III) the computing device [100] instructing the object detection network [170] and a distance calculation module [160] to generate autonomous driving information including information on distances, calculated by referring to the adjusted estimated pitch, between the subject autonomous vehicle and surrounding objects included in an autonomous driving image, wherein the computing device [100] instructs the adjusting pitch calibration module [150] to select one or more specific reference objects, among the reference objects, which satisfy a first condition on whether each of the reference objects has a specific class corresponding to a communicability or not, and to select said at least one target object, among the specific reference objects, which satisfies at least one of (i) a second condition on whether each of specific reference bounding boxes including each of the specific reference objects is located in at least one illustration window area of the reference image or not and (ii) a third condition on whether an aspect ratio of each of the specific reference bounding box is smaller than an estimation threshold value or not.
| 11. The device of Claim 10, wherein, before the process of (II), the processor [110] instructs a distance calculation module [160] to map each of one or more reference location base points, which are points in each of lower sides of each of reference bounding boxes including each of the reference objects, onto a space coordinate system corresponding to a virtual space including the subject autonomous vehicle by referring to the initial estimated pitch, the reference object detection information, and the reference image, to thereby calculate one or more longitudinal floor distances and one or more lateral floor distances between the reference objects and the subject autonomous vehicle, and then to thereby generate each of reference relative coordinates including each of the longitudinal floor distances and the lateral floor distances as its components.
| 12. The device of Claim 11, wherein, before the process of (II), the processor [110] instructs the distance calculation module [160] to map the reference location base points onto the space coordinate system.
| 13. The device of Claim 11, wherein, before the process of (II), the processor [110] instructs a V2V communication module [130] to communicate with one or more communicable objects located closer than a threshold distance from the subject autonomous vehicle, to thereby acquire information on one or more communicable object classes, one or more communicable object GT heights, and one or more communicable object coordinates, and instructs the adjusting pitch calibration module [150] to select specific reference objects, among the reference objects, which have a specific class corresponding to a communicability, and pair at least part of the communicable object GT heights with at least part of the specific reference objects by referring to (i) communicable object relative coordinates in relation to the subject autonomous vehicle, calculated by using the communicable object coordinates and (ii) specific reference relative coordinates on the specific reference objects, to thereby acquire specific reference GT heights on the specific reference objects.
| 14. The device of Claim 10, wherein, at the process of (II), the processor [110], if the target object is selected, instructs the adjusting pitch calibration module [150] to select a target relative coordinate corresponding to the target object, among the reference relative coordinates, and to calculate the estimated target height by performing a height estimating operation by referring to the initial estimated pitch.
| 15. The device of Claim 10, wherein, at the process of (II), the processor [110], in case the number of the target object is 1, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, and (ii-1) to adjust the initial estimated pitch to be decreased by a prescribed adjustment ratio if the estimated target height is included in the overestimated range, or (ii-2) to adjust the initial estimated pitch to be increased by the prescribed adjustment ratio if the estimated target height is included in the underestimated range.
| 16. The device of Claim 10, wherein, at the process of (II), the processor [110], in case the number of the target object is larger than or same as 2, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, (ii) to acquire information on at least one of an overestimated error ratio corresponding to the overestimated range and an underestimated error ratio corresponding to the underestimated range, and (iii) adjust the initial estimated pitch by referring to said information.
| 17. The device of Claim 10, wherein, at the process of (I), the processor [110] instructs the initial pitch calibration module [140] to generate the initial estimated pitch by applying the pitch calculation operation to the reference image.
| 18. The device of Claim 10, wherein, at the process of (I), the processor [110] (i) instructs a convolutional layer of the object detection network [170] to generate at least one reference convolutional feature map by applying at least one convolutional operation, which is a part of the neural network operation, to the reference image, (ii) instructs an ROI pooling layer of the object detection network [170] to apply at least one pooling operation, which is a part of the neural network operation, in order to pool values, corresponding to ROIs of the reference image, from the reference convolutional feature map, to thereby generate at least one reference ROI-Pooled feature map, and (iii) instructs an FC layer of the object detection network [170] to apply at least one FC operation, which is a part of the neural network operation, to the reference ROI-Pooled feature map, to thereby generate the reference object detection information including information on reference classes of the reference objects and reference bounding boxes including the reference objects. | The method involves instructing an initial pitch calibration module (140) to apply a pitch calculation operation to the reference image, to thereby generate an initial estimated pitch, which is a value generated by estimating the angle between the optical axis of the subject camera and the ground. An object detection network (170) is instructed to apply a neural network operation to the reference image, so that reference object detection information is generated on multiple reference objects in the reference image. A target object is selected among the reference objects. An estimated target height of the target object is calculated by referring to the initial estimated pitch and a relative location of the target object from the subject autonomous vehicle including the subject camera. An error corresponding to the initial estimated pitch is calculated by referring to the estimated target height and a ground-truth (GT) target height acquired through V2V communication. An INDEPENDENT CLAIM is included for a computing device for providing an object distance estimation based on a camera for communicating with a vehicle. Method for providing an object distance estimation based on a camera for communicating with an autonomous vehicle, such as a car. The accuracy of object distance estimation is improved, even when the quality of the input image acquired through the camera is not good. The pitch calibration is performed more precisely with the additional information acquired through the vehicle-to-vehicle communication. The drawing shows a block diagram of the computing device for providing an object distance estimation based on a camera for communicating with a vehicle. 115 Memory, 140 Initial pitch calibration module, 150 Adjusting pitch calibration module, 160 Distance calculation module, 170 Object detection network
Please summarize the input | METHOD AND DEVICE FOR PERFORMING MULTIPLE AGENT SENSOR FUSION IN COOPERATIVE DRIVING BASED ON REINFORCEMENT LEARNINGA method for learning a sensor fusion network for sensor fusion of an autonomous vehicle performing a cooperative driving is provided. The method includes steps of: a learning device, (a) inputting (i) a driving image including the autonomous vehicle, m cooperatively-driving vehicles, and second virtual vehicles and (ii) sensor status information on n sensors in the m cooperatively-driving vehicles into the sensor fusion network, to generate sensor fusion probabilities of sensor values of the n sensors being transmitted and generate fusion sensor information on s sensors having large probabilities, (b) inputting a road-driving video into a detection network, to detect the second virtual vehicles, pedestrians, and lanes and output nearby object information, and inputting sensor values and the nearby object information into a drive network, to generate moving direction probabilities and drive the autonomous vehicle and (c) acquiring traffic condition information, generating a reward, and learning the sensor fusion network.|1. A method for learning a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising steps of: (a) if (i) a driving image for training including (i-1) a subject autonomous vehicle, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle, and (i-3) second virtual vehicles performing a non-cooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively-driving vehicles for training are acquired, a learning device (100) performing a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (a-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (a-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training wherein s is an integer ranging from 1 to mxn; (b) the learning device (100) performing a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle and thus to output nearby object information for training, and a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training 
of said each of the m cooperatively-driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle by referring to the moving direction probabilities for training; and (c) the learning device (100) performing a process of acquiring traffic condition information for training on the subject autonomous vehicle driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward.
| 2. The method as claimed in 1, wherein, at the step of (a), the learning device (100) performs a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (140) to (i) generate a feature map for training by applying convolution operation using a Convolutional Neural Network (CNN) to the driving image for training and generate an image feature vector for training by applying fully-connected operation to the feature map for training, (ii) generate a sensor status feature vector for training by applying recurrent neural network operation using at least one long short-term memory (LSTM) to the multiple pieces of the sensor status information for training, and (iii) generate a concatenated feature vector for training by concatenating the image feature vector for training and the sensor status feature vector for training and generate the sensor fusion probabilities for training by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for training.
| 3. The method as claimed in 2, wherein the learning device (100) updates at least one parameter of the CNN, the at least one LSTM, and the at least one fully connected layer which are included in the sensor fusion network (140), by using the reward.
| 4. The method as claimed in 2, wherein the learning device (100) instructs the sensor fusion network (140) to (i) allow a pooling layer to apply max-pooling operation to the feature map for training and then (ii) apply fully-connected operation to a result of said (i), to thereby generate the image feature vector for training.
| 5. The method as claimed in 2, wherein the learning device (100) normalizes and outputs each of the sensor fusion probabilities for training using a softmax algorithm.
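Claims 2, 4 and 5 above describe the shape of the sensor fusion network: a CNN with max-pooling and an FC layer for the driving image, an LSTM for the sensor status information, and an FC layer with softmax applied to the concatenated feature vector. The PyTorch sketch below follows that structure; all layer sizes, and the choice to treat the m vehicles as the LSTM sequence dimension, are assumptions made only for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SensorFusionNet(nn.Module):
    """Illustrative sketch of the architecture in claims 2, 4 and 5."""
    def __init__(self, m=4, n=6, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(m + 1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.pool = nn.AdaptiveMaxPool2d(4)               # max-pooling on the feature map (claim 4)
        self.img_fc = nn.Linear(32 * 4 * 4, hidden)       # image feature vector
        self.lstm = nn.LSTM(input_size=n, hidden_size=hidden, batch_first=True)
        self.out_fc = nn.Linear(2 * hidden, m * n)        # one probability per (vehicle, sensor)

    def forward(self, driving_image, sensor_status):
        x = self.pool(self.cnn(driving_image)).flatten(1)
        img_vec = F.relu(self.img_fc(x))
        _, (h, _) = self.lstm(sensor_status)              # sensor status feature vector
        status_vec = h[-1]
        concat = torch.cat([img_vec, status_vec], dim=1)  # concatenated feature vector
        return F.softmax(self.out_fc(concat), dim=1)      # normalized sensor fusion probabilities (claim 5)

net = SensorFusionNet()
probs = net(torch.rand(1, 5, 32, 32),                     # (m+1)-channel driving image, m = 4
            torch.rand(1, 4, 6))                          # status sequence: m vehicles x n sensors
print(probs.shape)                                        # torch.Size([1, 24])
```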
| 6. The method as claimed in 1, wherein the reward is generated by subtracting the number of the s sensors for training from a sum of the number of the n sensors for training in each of the m cooperatively-driving vehicles for training, and wherein the learning device (100) increases or decreases the reward by referring to the traffic condition information for training.
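A tiny sketch of the reward in claim 6: the base reward is the total sensor count over the m vehicles minus the number s of sensors actually selected for transmission, and it is then increased or decreased by referring to the traffic condition. The additive form of the traffic adjustment is an assumption; the claim only says the reward is increased or decreased.

```python
def compute_reward(n_sensors_per_vehicle, s_selected, traffic_adjustment=0.0):
    """Claim-6-style reward: sum(n over the m vehicles) - s, then adjusted for traffic."""
    base = sum(n_sensors_per_vehicle) - s_selected
    return base + traffic_adjustment

# e.g. 3 vehicles with 6 sensors each, 7 sensor values transmitted, smooth traffic
print(compute_reward([6, 6, 6], 7, traffic_adjustment=+1.0))   # 12.0
```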
| 7. The method as claimed in 1, wherein the driving image for training is an entire road image of an entire road on which the m cooperatively-driving vehicles for training is in the cooperative driving, and is an image with m+1 channels which represents whether each of blocks of a certain size, into which the entire road image is divided as a grid, is occupied by said each of the m cooperatively driving vehicles for training or by all of the second virtual vehicles, and wherein each of m channels among said m+1 channels corresponds to said each of the m cooperatively-driving vehicles for training, and a remaining channel among said m+1 channels corresponds to the second virtual vehicles.
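The (m+1)-channel occupancy representation of claim 7 can be illustrated with a small NumPy helper: one binary channel per cooperatively-driving vehicle plus one shared channel for all other vehicles. The grid size and the cell representation are assumptions for the sketch.

```python
import numpy as np

def build_driving_image(grid_h, grid_w, coop_cells, other_cells):
    """Build the (m+1)-channel driving image of claim 7.

    `coop_cells`: list (length m) of per-vehicle lists of occupied (row, col) cells.
    `other_cells`: cells occupied by all non-cooperative vehicles (one shared channel).
    """
    m = len(coop_cells)
    image = np.zeros((m + 1, grid_h, grid_w), dtype=np.float32)
    for ch, cells in enumerate(coop_cells):          # one channel per cooperative vehicle
        for r, c in cells:
            image[ch, r, c] = 1.0
    for r, c in other_cells:                         # remaining channel: all other vehicles
        image[m, r, c] = 1.0
    return image

img = build_driving_image(32, 32,
                          coop_cells=[[(5, 10)], [(6, 11)], [(7, 12)]],
                          other_cells=[(20, 3), (21, 3)])
print(img.shape)                                     # (4, 32, 32) for m = 3
```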
| 8. A method for testing a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising steps of: (a) on condition that a learning device (100) has performed, if (i) a driving image for training including (i-1) a subject autonomous vehicle for training, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle for training, and (i-3) second virtual vehicles performing a noncooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively-driving vehicles for training are acquired, (1) a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (1-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (1-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training wherein s is an integer ranging from 1 to mxn, (2) a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle for training and thus to output nearby object information for training, (3) a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training of said each of the m cooperatively driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle for training by referring to the moving direction probabilities for training, and (4) a process of acquiring traffic condition information for training on the subject autonomous vehicle for training driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward, if (i) a driving image for testing including (i-1) a subject autonomous vehicle for testing, (i-2) k cooperatively-driving vehicles for testing having first vehicles performing the cooperative driving with the subject autonomous vehicle for testing, and (i-3) second vehicles performing the non-cooperative driving, in an actual driving environment, and (ii) multiple pieces of sensor status information for testing on i sensors for testing in each of the k cooperatively-driving vehicles for testing are acquired, a testing device 
(200) of at least one of the k cooperatively-driving vehicles for testing performing a process of inputting the driving image for testing and the multiple pieces of the sensor status information for testing into the sensor fusion network (140), to thereby allow the sensor fusion network (a-1) to generate sensor fusion probabilities for testing which are probabilities of said each of the k cooperatively-driving vehicles for testing transmitting each of sensor values of each of the i sensors for testing over the V2V communication for the cooperative driving, by applying its neural network operation to the driving image for testing and the multiple pieces of the sensor status information for testing (a-2) to generate fusion sensor information for testing on s sensors for testing having probabilities larger than a predetermined threshold among the sensor fusion probabilities for testing, and (a-3) to transmit the fusion sensor information for testing on the s sensors for testing to at least part of the k cooperatively-driving vehicles for testing over the V2V communication; and (b) the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing performing a process of inputting a road-driving video for testing acquired over the V2V communication in response to the fusion sensor information for testing into the detection network (150), to thereby allow the detection network (150) to detect at least part of the second vehicles, the pedestrians, and the lanes on a driving road of the subject autonomous vehicle for testing and thus to output nearby object information for testing, and a process of inputting both sensor values for testing, acquired over the V2V communication in response to the fusion sensor information for testing, and the nearby object information for testing into the drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for testing of said each of the k cooperatively driving vehicles for testing by referring to the sensor values for testing and the nearby object information for testing, and thus to drive the subject autonomous vehicle for testing by referring to the moving direction probabilities for testing.
| 9. The method as claimed in 8, wherein, at the step of (a), the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing performs a process of inputting the driving image for testing and the multiple pieces of the sensor status information for testing into the sensor fusion network (140), to thereby allow the sensor fusion network (140) to (i) generate a feature map for testing by applying convolution operation of a Convolutional Neural Network (CNN) to the driving image for testing and generate an image feature vector for testing by applying fully-connected operation to the feature map for testing, (ii) generate a sensor status feature vector for testing by applying recurrent neural network operation of at least one long short-term memory (LSTM) to the multiple pieces of the sensor status information for testing, and (iii) generate a concatenated feature vector for testing by concatenating the image feature vector for testing and the sensor status feature vector for testing and generate the sensor fusion probabilities for testing by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for testing.
| 10. The method as claimed in 9, wherein the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing instructs the sensor fusion network (140) to (i) allow a pooling layer to apply max-pooling operation to the feature map for testing and then (ii) apply fully-connected operation to a result of said (i), to thereby generate the image feature vector for testing.
| 11. The method as claimed in 9, wherein the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing normalizes and outputs each of the sensor fusion probabilities for testing using a softmax algorithm.
| 12. The method as claimed in 8, wherein, at the step of (a), the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing performs (i) a process of generating a feature map for testing by applying multiple convolution operation using a specific CNN to the driving image for testing, acquired from a specific cooperatively-driving vehicle among the k cooperatively-driving vehicles for testing, and if an image feature vector for testing is generated by applying fully-connected operation to the feature map for testing, a process of acquiring the image feature vector for testing from the specific cooperatively-driving vehicle over the V2V communication, (ii) a process of generating a sensor status feature vector for testing by applying recurrent neural network operation using at least one LSTM to the multiple pieces of the sensor status information for testing, and (iii) a process of generating a concatenated feature vector for testing by concatenating the image feature vector for testing and the sensor status feature vector for testing acquired over the V2V communication and a process of generating the sensor fusion probabilities for testing by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for testing.
| 13. The method as claimed in 12, wherein the specific vehicle allows a specific CNN to apply convolution operation to the driving image for testing to thereby generate a feature map for testing, and to apply fully-connected operation to the feature map for testing to thereby generate the image feature vector for testing, and wherein the specific vehicle is one of the k cooperatively-driving vehicles for testing which is designated sequentially at stated intervals according to a round-robin schedule.
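A short sketch of the round-robin designation in claim 13: the vehicle that runs the specific CNN is cycled sequentially at a stated interval. The concrete interval and step units are assumptions.

```python
from itertools import cycle

def round_robin_designation(vehicle_ids, interval_steps, total_steps):
    """Designate the CNN-running vehicle sequentially at stated intervals (claim-13 style)."""
    schedule, order = [], cycle(vehicle_ids)
    current = next(order)
    for step in range(total_steps):
        if step > 0 and step % interval_steps == 0:
            current = next(order)                    # hand over to the next vehicle in the cycle
        schedule.append((step, current))
    return schedule

print(round_robin_designation(["veh_A", "veh_B", "veh_C"], interval_steps=2, total_steps=6))
```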
| 14. A learning device (100) for learning a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising: at least one memory (120) that stores instructions; and at least one processor (130) configured to execute the instructions to perform or support another device to perform: (I) if (i) a driving image for training including (i-1) a subject autonomous vehicle, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle, and (i-3) second virtual vehicles performing a non-cooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively driving vehicles for training are acquired, a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (I-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (I-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training wherein s is an integer ranging from 1 to mxn, (II) a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle and thus to output nearby object information for training, and a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training of said each of the m cooperatively-driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle by referring to the moving direction probabilities for training, and (III) a process of acquiring traffic condition information for training on the subject autonomous vehicle driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward.
| 15. The learning device (100) as claimed in 14, wherein, at the process of (I), the processor (130) performs a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (140) to (i) generate a feature map for training by applying convolution operation using a Convolutional Neural Network (CNN) to the driving image for training and generate an image feature vector for training by applying fully-connected operation to the feature map for training, (ii) generate a sensor status feature vector for training by applying recurrent neural network operation using at least one long short-term memory (LSTM) to the multiple pieces of the sensor status information for training, and (iii) generate a concatenated feature vector for training by concatenating the image feature vector for training and the sensor status feature vector for training and generate the sensor fusion probabilities for training by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for training.
| 16. The learning device (100) as claimed in 15, wherein the processor (130) updates at least one parameter of the CNN, the at least one LSTM, and the at least one fully connected layer which are included in the sensor fusion network (140), by using the reward.
| 17. The learning device (100) as claimed in 15, wherein the processor (130) instructs the sensor fusion network (140) to (i) allow a pooling layer to apply max-pooling operation to the feature map for training and then (ii) apply fully-connected operation to a result of said (i), to thereby generate the image feature vector for training.
| 18. The learning device (100) as claimed in 15, wherein the processor (130) normalizes and outputs each of the sensor fusion probabilities for training using a softmax algorithm.
| 19. The learning device (100) as claimed in 14, wherein the reward is generated by subtracting the number of the s sensors for training from a sum of the number of the n sensors for training in each of the m cooperatively-driving vehicles for training, and wherein the processor (130) increases or decreases the reward by referring to the traffic condition information for training.
| 20. The learning device (100) as claimed in 14, wherein the driving image for training is an entire road image of an entire road on which the m cooperatively-driving vehicles for training is in the cooperative driving, and is an image with m+1 channels which represents whether each of blocks of a certain size, into which the entire road image is divided as a grid, is occupied by said each of the m cooperatively-driving vehicles for training or by all of the second virtual vehicles, and wherein each of m channels among said m+1 channels corresponds to said each of the m cooperatively-driving vehicles for training, and a remaining channel among said m+1 channels corresponds to the second virtual vehicles.
| 21. A testing device (200) of at least one of k cooperatively-driving vehicles for testing, to be used for testing a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising: at least one memory (220) that stores instructions; and at least one processor (230) configured to execute the instructions to perform or support another device to perform: (I) on condition that a learning device (100) has performed, if (i) a driving image for training including (i-1) a subject autonomous vehicle for training, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle for training, and (i-3) second virtual vehicles performing a noncooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively-driving vehicles for training are acquired, (1) a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (1-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (1-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training wherein s is an integer ranging from 1 to mxn, (2) a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle for training and thus to output nearby object information for training, (3) a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training of said each of the m cooperatively-driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle for training by referring to the moving direction probabilities for training, and (4) a process of acquiring traffic condition information for training on the subject autonomous vehicle for training driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward, if (i) a driving image for testing including (i-1) a subject autonomous vehicle for testing, (i-2) the k cooperativelydriving vehicles for testing having first vehicles performing the cooperative driving with the subject autonomous vehicle for testing, and (i-3) second vehicles 
performing the noncooperative driving, in an actual driving environment, and (ii) multiple pieces of sensor status information for testing on i sensors for testing in each of the k cooperatively-driving vehicles for testing are acquired, a process of inputting the driving image for testing and the multiple pieces of the sensor status information for testing into the sensor fusion network (140), to thereby allow the sensor fusion network (I-1) to generate sensor fusion probabilities for testing which are probabilities of said each of the k cooperatively-driving vehicles for testing transmitting each of sensor values of each of the i sensors for testing over the V2V communication for the cooperative driving, by applying its neural network operation to the driving image for testing and the multiple pieces of the sensor status information for testing, (I-2) to generate fusion sensor information for testing on s sensors for testing having probabilities larger than a predetermined threshold among the sensor fusion probabilities for testing, and (I-3) to transmit ... | The method involves acquiring a driving image for training which includes a subject autonomous vehicle, cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle, and second virtual vehicles performing a non-cooperative driving. The learning device (100) performs a process of inputting a road-driving video for training, acquired over the vehicle-to-vehicle (V2V) communication in response to the fusion sensor information for training, into a detection network (150). The learning device performs a process of acquiring traffic condition information for training on the subject autonomous vehicle driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward. INDEPENDENT CLAIMS are included for the following: a learning device for learning a sensor fusion network to be used for sensor fusion of an autonomous vehicle performing a cooperative driving; and a testing device of one of the cooperatively-driving vehicles for testing. Method for learning a sensor fusion network to be used for sensor fusion of an autonomous vehicle performing a cooperative driving. The sensor information from other autonomous vehicles in the cooperative driving mode is utilized to support functional safety, and the reliability of the autonomous vehicles may be improved by training from virtual driving. The vehicle reduces its speed if a nearby vehicle is detected as having a possibility of collision with the subject autonomous vehicle within a second threshold time. The drawing shows a block diagram of a learning device for learning a sensor fusion network for sensor fusion of a subject autonomous vehicle. 100 Learning device, 130 Processor, 140 Sensor fusion network, 150 Detection network, 160 Drive network
Please summarize the input | METHOD AND DEVICE FOR EGO-VEHICLE LOCALIZATION TO UPDATE HD MAP BY USING V2X INFORMATION FUSIONA method for calculating exact location of a subject vehicle by using information on relative distances is provided. And the method includes steps of: (a) a computing device, if a reference image is acquired through a camera on the subject vehicle, detecting reference objects in the reference image; (b) the computing device calculating image-based reference distances between the reference objects and the subject vehicle, by referring to information on reference bounding boxes, corresponding to the reference objects, on the reference image; (c) the computing device (i) generating a distance error value by referring to the image-based reference distances and coordinate-based reference distances, and (ii) calibrating subject location information of the subject vehicle by referring to the distance error value.|1. A computer implemented method for calculating a location of a subject vehicle, the method comprising the steps of:
* (a) detecting one or more reference objects in a reference image acquired through a camera on the subject vehicle, by applying at least one object detection operation to the reference image;
* (b) calculating each of one or more image-based reference distances (1-..) between each of the reference objects and the subject vehicle, by referring to information on each of reference bounding boxes, corresponding to each of the reference objects, on the reference image; and
* (c) (i) acquiring one or more coordinate-based reference distances (2-..) between each of the reference objects and the subject vehicle by using information on a position of the subject vehicle acquired through a global positioning system, GPS, and information on locations of the reference objects acquired via vehicle-to-everything, V2X communications, or from a database, and generating at least one distance error value by referring to the image-based reference distances (1-..) and the one or more coordinate-based reference distances (2-..) between each of the reference objects and the subject vehicle, and (ii) calibrating subject location information of the subject vehicle by calculating an optimized location of the subject vehicle by minimizing the distance error value,
* wherein the step of (b) further comprises calculating the image-based reference distances by applying at least one image distance estimation operation, which uses information on parameters of the camera, to the reference bounding boxes, and
* wherein (i) one or more (1-1)-st specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-1)-st specific image distance estimation operation, which further uses (1-1)-st specific reference object height information acquired through V2X communications between the subject vehicle and one or more (1-1)-st specific reference objects among the reference objects, to (1-1)-st specific reference bounding boxes, and (ii) one or more (1-2)-nd specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-2)-nd specific image distance estimation operation, which further uses information on a reference point in the reference image, to (1-2)-nd specific reference bounding boxes,
* wherein the (1-1)-st specific image distance estimation operation is performed by using the following formula: D = (f × VH) / h
* wherein D denotes one of the (1-1)-st specific image-based reference distances, f denotes a focal length of the camera, VH denotes a piece of the (1-1)-st specific reference object height information corresponding to one of the (1-1)-st specific reference objects, and h denotes an apparent height of said one of the (1-1)-st specific reference objects on the reference image,
* wherein the (1-2)-nd specific image distance estimation operation is performed by using the following formula: D = (f × H) / |b - cy|
* wherein D denotes one of the (1-2)-nd specific image-based reference distances, f denotes a focal length of the camera, H denotes a height of the camera, and |b - cy| denotes an apparent distance, on the reference image, between the reference point and a lower boundary of one of the (1-2)-nd specific reference bounding boxes corresponding to one of (1-2)-nd specific reference objects, wherein the reference point is a center point of the reference image.
| 2. The method of Claim 1, wherein the step of (c) further comprises generating the distance error value by using the following formula: Error_D = Σ_{k=1..N} w_k × (D_i^k - D_c^k)^2, wherein D_i^k denotes one of the image-based reference distances corresponding to a k-th reference object among the reference objects, D_c^k denotes one of the coordinate-based reference distances corresponding to the k-th reference object, w_k denotes a weight for the k-th object, and N denotes the number of the reference objects.
| 3. The method of Claim 1, wherein (i) one or more (2-1)-st specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-1)-st specific reference object location information acquired through V2X communications between the subject vehicle and one or more (2-1)-st specific reference objects among the reference objects, and (ii) one or more (2-2)-nd specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-2)-nd specific reference object location information which has been acquired from a database.
| 4. The method of Claim 1, wherein the step of (a) further comprises applying the object detection operation to the reference image by instructing a Convolutional Neural Network, CNN, to apply at least one convolutional operation, at least one Region-Of-Interest, ROI, pooling operation and at least one Fully-Connected, FC, network operation to the reference image, in that order.
| 5. The method of Claim 1, wherein the step of (c) further comprises calibrating the subject location information to generate calibrated subject location information which makes its corresponding distance error value smallest, by repeating processes of (i) adjusting the subject location information, (ii) re-calculating the distance error value by using the adjusted subject location information, and (iii) re-adjusting the adjusted subject location information by referring to information on whether the re-calculated distance error value has become smaller or not.
| 6. The method of Claim 1, wherein the step of (c) further comprises, while storing update information to be used for updating a High-Definition Map(HD Map), recording information on locations where the update information has been acquired, by referring to the calibrated subject location information.
| 7. The method of Claim 1, wherein the reference objects include at least part of (i) one or more mobile objects capable of a V2X communication and (ii) one or more fixed objects, (ii-1) which are capable of the V2X communication or (ii-2) whose information is stored in a database, and
wherein the reference objects include at least part of one or more traffic signs, one or more traffic lights, one or more road markings and one or more surrounding vehicles located closer than a threshold from the subject vehicle.
| 8. The method of Claim 1, wherein the step of (c) further comprises calculating a location, on a High-Definition, HD, map, of the subject vehicle by referring to the calibrated subject location information, and acquiring information, from the HD map, on one or more objects located closer than a threshold from the subject vehicle, to thereby support an autonomous driving of the subject vehicle.
| 9. A device (100) for calculating a location of a subject vehicle, the device comprising:
* at least one memory (115) that stores instructions; and
* at least one processor (120) configured to execute the instructions to perform processes of: (I) detecting one or more reference objects in a reference image acquired through a camera on the subject vehicle, by applying at least one object detection operation to the reference image; (II) calculating each of one or more image-based reference distances (1-..) between each of the reference objects and the subject vehicle, by referring to information on each of reference bounding boxes, corresponding to each of the reference objects, on the reference image; and (III) (i) acquiring one or more coordinate-based reference distances (2-..) between each of the reference objects and the subject-vehicle by using information on a position of the subject vehicle acquired through a GPS , and information on locations of the reference objects acquired via vehicle-to-everything, V2X, communications, or from a database, and generating at least one distance error value by referring to the image-based reference distances (1-..) and
* the one or more coordinate-based reference distances (2-..) between each of the reference objects and the subject vehicle, and (ii) calibrating subject location information of the subject vehicle by calculating an optimized location of the subject vehicle by minimizing the distance error value,
* wherein the process of (II) further comprises the processor calculating the image-based reference distances by applying at least one image distance estimation operation, which uses information on parameters of the camera, to the reference bounding boxes, and
* wherein (i) one or more (1-1)-st specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-1)-st specific image distance estimation operation, which further uses (1-1)-st specific reference object height information acquired through V2X communications between the subject vehicle and one or more (1-1)-st specific reference objects among the reference objects, to (1-1)-st specific reference bounding boxes, and (ii) one or more (1-2)-nd specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-2)-nd specific image distance estimation operation, which further uses information on a reference point in the reference image, to (1-2)-nd specific reference bounding boxes,
* wherein the (1-1)-st specific image distance estimation operation is performed by using the following formula: D = (f × VH) / h
* wherein D denotes one of the (1-1)-st specific image-based reference distances, f denotes a focal length of the camera, VH denotes a piece of the (1-1)-st specific reference object height information corresponding to one of the (1-1)-st specific reference objects, and h denotes an apparent height of said one of the (1-1)-st specific reference objects on the reference image,
* wherein the (1-2)-nd specific image distance estimation operation is performed by using the following formula: D = (f × H) / |b - cy|
* wherein D denotes one of the (1-2)-nd specific image-based reference distances, f denotes a focal length of the camera, H denotes a height of the camera, and |b - cy| denotes an apparent distance, on the reference image, between the reference point and a lower boundary of one of the (1-2)-nd specific reference bounding boxes corresponding to one of (1-2)-nd specific reference objects, wherein the reference point is a center point of the reference image.
| 10. The device of Claim 9, wherein, at the process of (III), the processor generates the distance error value by using the following formula: Error_D = Σ_{k=1..N} w_k × (D_i^k - D_c^k)^2, wherein D_i^k denotes one of the image-based reference distances corresponding to a k-th reference object among the reference objects, D_c^k denotes one of the coordinate-based reference distances corresponding to the k-th reference object, w_k denotes a weight for the k-th object, and N denotes the number of the reference objects.
| 11. The device of Claim 9, wherein (i) one or more (2-1)-st specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-1)-st specific reference object location information acquired through V2X communications between the subject vehicle and one or more (2-1)-st specific reference objects among the reference objects, and (ii) one or more (2-2)-nd specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-2)-nd specific reference object location information which has been acquired from a database. | The method involves a computing device detecting reference objects in the reference image (1), if at least one reference image is acquired through at least one camera on the subject vehicle interworking with the computing device, by applying at least one object detection operation to the reference image. The computing device calculates image-based reference distances (2) between each of the reference objects and the subject vehicle by referring to information on each of the reference bounding boxes, corresponding to each of the reference objects, on the reference image. The computing device generates a distance error value by referring to the image-based reference distances and the coordinate-based reference distances, once each of the one or more coordinate-based reference distances between each of the reference objects and the subject vehicle is acquired. An INDEPENDENT CLAIM is included for a device for calculating exact location of a subject vehicle by using information on relative distances. Method for calculating exact location of a subject vehicle by using information on relative distances. Can also be used for ego-vehicle localization by using the vehicle-to-everything (V2X) information fusion. The method calculates a more precise location of the ego-vehicle, which allows information acquired by the ego-vehicle to be mapped onto the high definition (HD) map more correctly. The drawing shows a flowchart of the method for the ego-vehicle localization by using the V2X information fusion. 1Computing device detecting reference objects in the reference image2Computing device calculating image-based reference distances3Computing device calibrating subject location information
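The two image-distance formulas of claims 1 and 9 above and the weighted distance-error term of claims 2 and 10 can be exercised with a small numeric sketch. This is only an illustration of the formulas, not the patented implementation: the camera parameters, object height, bounding-box values, reference-object coordinates and the grid-search calibration below are all assumed example values.

```python
import math

# Camera parameters (assumed example values, not from the patent)
FOCAL_PX = 1200.0      # focal length f, in pixels
CAM_HEIGHT_M = 1.4     # camera mounting height H, in meters
PRINCIPAL_Y = 540.0    # reference point c_y: image center row of a 1080-row image

def dist_from_known_height(apparent_h_px, real_height_m, f=FOCAL_PX):
    """(1-1)-st operation: D = f * VH / h, with the object height VH received over V2X."""
    return f * real_height_m / apparent_h_px

def dist_from_ground_contact(box_bottom_px, f=FOCAL_PX, H=CAM_HEIGHT_M, cy=PRINCIPAL_Y):
    """(1-2)-nd operation: D = f * H / |b - cy|, with b the bottom row of the bounding box."""
    return f * H / abs(box_bottom_px - cy)

def distance_error(image_dists, coord_dists, weights):
    """Error_D = sum_k w_k * (D_i^k - D_c^k)^2 over the N reference objects."""
    return sum(w * (di - dc) ** 2
               for w, di, dc in zip(weights, image_dists, coord_dists))

# Image-based distances for three reference objects (made-up measurements)
d_img = [dist_from_known_height(90.0, 1.5),   # e.g. a surrounding vehicle, VH received via V2X
         dist_from_ground_contact(860.0),      # e.g. a road marking
         dist_from_ground_contact(700.0)]      # e.g. a traffic sign pole base

def coord_dists(subject_xy, ref_xys):
    """Coordinate-based distances from a candidate subject-vehicle position."""
    return [math.hypot(rx - subject_xy[0], ry - subject_xy[1]) for rx, ry in ref_xys]

refs = [(0.0, 20.0), (2.0, 5.0), (-1.0, 10.0)]   # reference-object positions in a map frame (assumed)
weights = [1.0, 1.0, 1.0]

# Calibrate the subject location by searching for the position that minimizes the error,
# mirroring the adjust / re-compute / keep-if-smaller loop of claim 5.
best_xy, best_err = None, float("inf")
for dx in (x * 0.1 for x in range(-30, 31)):
    for dy in (y * 0.1 for y in range(-30, 31)):
        cand = (0.3 + dx, -0.2 + dy)             # start from a noisy GPS fix (assumed)
        err = distance_error(d_img, coord_dists(cand, refs), weights)
        if err < best_err:
            best_xy, best_err = cand, err

print("calibrated subject location:", best_xy, "error:", round(best_err, 4))
```

The brute-force grid search is only a stand-in for the iterative adjustment described in claim 5; any gradient-based or least-squares optimizer could minimize the same error term.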
Please summarize the input | LEARNING METHOD AND LEARNING DEVICE FOR DETERMINING WHETHER TO SWITCH MODE OF VEHICLE FROM MANUAL DRIVING MODE TO AUTONOMOUS DRIVING MODE BY PERFORMING TRAJECTORY-BASED BEHAVIOR ANALYSIS ON RECENT DRIVING ROUTE. A learning method for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver is provided. And the method includes steps of: (a) a learning device, on condition that a status vector and a trajectory vector are acquired, performing processes of (i) instructing a status network to generate a status feature map and (ii) instructing a trajectory network to generate a trajectory feature map; (b) the learning device instructing a safety network to calculate a predicted collision probability representing a predicted probability of an accident occurrence; and (c) the learning device instructing a loss layer to generate a loss by referring to the predicted collision probability and a GT collision probability, which have been acquired beforehand, to learn at least part of parameters.|1. A learning method for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver with regard to a circumstance of its corresponding time range, comprising steps of:
* (a) a learning device, on condition that (i) at least one status vector, corresponding to at least one piece of circumstance information for verification including at least part of (i-1) at least one piece of subject motion information on at least one subject vehicle and (i-2) one or more pieces of surrounding motion information on at least part of one or more surrounding objects located closer than a threshold from the subject vehicle, in a subject time range for verification from a first timing to a T-th timing, and (ii) at least one trajectory vector, corresponding to at least one piece of route information for verification on at least one driving route driven by the subject vehicle in the subject time range for verification, are acquired, performing processes of (i) instructing (S01) a status network to apply at least one first neural network operation to the status vector, to thereby generate at least one status feature map and (ii) instructing (S02) a trajectory network to apply at least one second neural network operation to the trajectory vector, to thereby generate at least one trajectory feature map;
* (b) the learning device, if at least one concatenated feature map corresponding to the status feature map and the trajectory feature map is acquired, instructing (S03) a safety network to apply at least one third neural network operation to the concatenated feature map, to thereby calculate at least one predicted collision probability representing a predicted probability of an occurrence of at least one accident caused by the driving route indicated by the route information for verification with regard to a circumstance indicated by the circumstance information for verification, wherein the predicted collision probability is used for determining whether it is appropriate or not to switch driving modes of the subject vehicle from a manual driving mode to an autonomous driving mode; and
* (c) the learning device instructing (S04) a loss layer to generate at least one loss by referring to the predicted collision probability and at least one Ground-Truth, GT, collision probability, which have been acquired beforehand, and to perform backpropagation by using the loss, to thereby learn at least part of parameters of the safety network, the trajectory network and the status network, wherein, at the step of (b), the learning device instructs (i) at least one concatenating layer to generate the concatenated feature map by concatenating the status feature map and the trajectory feature map, (ii) at least one third convolutional layer of the safety network to generate at least one (3-1)-st feature map by applying at least one third convolutional operation to the concatenated feature map, (iii) at least one third pooling layer of the safety network to generate at least one (3-2)-nd feature map by applying at least one third pooling operation to the (3-1)-st feature map, and (iv) at least one third Fully-Connected, FC, layer to generate the predicted collision probability by applying at least one third FC operation to the (3-2)-nd feature map.
| 2. The method of Claim 1, wherein, at the step of (a), the learning device instructs (i) at least one first convolutional layer of the status network to generate at least one (1-1)-st feature map by applying at least one first convolutional operation to the status vector, (ii) at least one first pooling layer of the status network to generate at least one (1-2)-nd feature map by applying at least one first pooling operation to the (1-1)-st feature map, and (iii) at least one first Fully-Connected, FC, layer to generate the status feature map by applying at least one first FC operation to the (1-2)-nd feature map.
| 3. The method of Claim 1, wherein, at the step of (a), the learning device instructs (i) at least one second convolutional layer of the trajectory network to generate at least one (2-1)-st feature map by applying at least one second convolutional operation to the trajectory vector, (ii) at least one second pooling layer of the trajectory network to generate at least one (2-2)-nd feature map by applying at least one second pooling operation to the (2-1)-st feature map, and (iii) at least one second Fully-Connected, FC, layer to generate the trajectory feature map by applying at least one second FC operation to the (2-2)-nd feature map.
| 4. The method of any one of Claims 1 to 3, before the step of (a), further comprising a step of:
(a0) the learning device communicating with at least one basement server interworking with the subject vehicle, to perform processes of (i) generating the status vector by using the circumstance information for verification, including (i-1) at least one piece of subject location information of the subject vehicle, (i-2) at least one piece of subject velocity information thereof, (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, corresponding to the subject time range for verification, which have been acquired from the basement server, (ii) generating the trajectory vector by using the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired from the basement server, and (iii) acquiring the GT collision probability by using at least one piece of accident information on whether the subject vehicle has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired from the basement server, wherein K is at least one arbitrary integer.
| 5. The method of any one of Claims 1 to 4, before the step of (a), further comprising a step of:
(a1) the learning device performs processes of (i) generating the status vector by using the circumstance information for verification corresponding to the subject time range for verification, including (i-1) at least one piece of subject location information of the subject vehicle and (i-2) at least one piece of subject velocity information thereof, which have been acquired by referring to at least one piece of driving record information of the subject vehicle, and (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, which have been acquired by referring to at least one driving video recorded through at least one subject camera on the subject vehicle during the subject time range for verification, (ii) generating the trajectory vector by referring to the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired by referring to the driving record information, and (iii) acquiring the GT collision probability by using accident information on whether the subject vehicles has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired by referring to the driving record information, wherein K is at least one arbitrary integer.
| 6. The method of any one of Claims 1 to 5, before the step of (a), further comprising a step of:
(a2) the learning device performs processes of (i) generating the status vector by using the circumstance information for verification corresponding to the subject time range for verification, including (i-1) at least one piece of subject location information of the subject vehicle and (i-2) at least one piece of subject velocity information thereof, which have been acquired by referring to at least one piece of driving record information of the subject vehicle, and (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, which have been acquired by using at least one V2X communication module installed to the subject vehicle, to be used for communicating with said at least part of the surrounding vehicles, (ii) generating the trajectory vector by referring to the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired by referring to the driving record information, and (iii) acquiring the GT collision probability by using accident information on whether the subject vehicle has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired by referring to the driving record information, wherein K is at least one arbitrary integer.
| 7. The method of any one of Claims 1 to 6, before the step of (a), further comprising a step of:
(a3) the learning device communicating with at least one simulating device simulating at least one virtual world including the subject vehicle and the surrounding objects, to perform processes of (i) generating the status vector by using the circumstance information for verification, including (i-1) at least one piece of subject location information of the subject vehicle, (i-2) at least one piece of subject velocity information thereof, (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, corresponding to the subject time range for verification, which have been acquired from the simulating device, (ii) generating the trajectory vector by referring to the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired from the simulating device, and (iii) acquiring the GT collision probability by using at least one piece of accident information on whether the subject vehicle has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired from the simulating device, wherein K is at least one arbitrary integer.
| 8. The method of any one of Claims 1 to 7, wherein the subject motion information includes at least part of (i-1) at least one piece of subject location information of the subject vehicle, (i-2) at least one piece of subject velocity information thereof, and (i-3) at least one piece of subject acceleration information thereof, and
wherein the surrounding motion information includes at least part of (ii-1) at least one piece of surrounding location information of at least part of the surrounding objects, (ii-2) at least one piece of surrounding velocity information thereof, and (ii-3) at least one piece of surrounding acceleration information thereof.
| 9. A testing method for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver with regard to a circumstance of its corresponding time range, comprising steps of:
* (a) on condition that (1) a learning device, if (i) at least one status vector for training, corresponding to at least one piece of circumstance information for verification for training including at least part of (i-1) at least one piece of subject motion information for training on at least one subject vehicle for training and (i-2) one or more pieces of surrounding motion information for training on at least part of one or more surrounding objects for training located closer than a threshold from the subject vehicle for training, in a subject time range for verification for training from a first timing to a T-th timing, and (ii) at least one trajectory vector for training, corresponding to at least one piece of route information for verification for training on at least one driving route for training driven by the subject vehicle for training in the subject time range for verification for training, have been acquired, has performed processes of (i) instructing a status network to apply at least one first neural network operation to the status vector for training, to thereby generate at least one status feature map for training and (ii) instructing a trajectory network to apply at least one second neural network operation to the trajectory vector for training, to thereby generate at least one trajectory feature map for training; (2) the learning device, if at least one concatenated feature map for training corresponding to the status feature map for training and the trajectory feature map for training has been acquired, has instructed a safety network to apply at least one third neural network operation to the concatenated feature map for training, to thereby calculate at least one predicted collision probability for training representing a predicted probability for training of an occurrence of at least one accident for training caused by the driving route for training indicated by the route information for verification for training with regard to a circumstance for training indicated by the circumstance information for verification for training, wherein the predicted collision probability is used for determining whether it is appropriate or not to switch driving modes of the subject vehicle from a manual driving mode to an autonomous driving mode; and (3) the learning device has instructed a loss layer to generate at least one loss by referring to the predicted collision probability for training and at least one Ground-Truth, GT, collision probability, which have been acquired beforehand, and to perform backpropagation by using the loss, to thereby learn at least part of parameters of the safety network, the trajectory network and the status network, wherein, the learning device instructs (i) at least one concatenating layer to generate the concatenated feature map for training by concatenating the status feature map and the trajectory feature map, (ii) at least one third convolutional layer of the safety network to generate at least one (3-1)-st feature map for training by applying at least one third convolutional operation to the concatenated feature map, (iii) at least one third pooling layer of the safety network to generate at least one (3-2)-nd feature map for training by applying at least one third pooling operation to the (3-1)-st feature map, and (iv) at least one third Fully-Connected, FC, layer to generate the predicted collision probability for training by applying at least one third FC operation to the (3-2)-nd feature map, a testing device, if (i) at least one status 
vector for testing, corresponding to at least one piece of test circumstance information for verification including at least part of (i-1) at least one piece of subject motion information for testing on at least one subject vehicle for testing and (i-2) one or more pieces of surrounding motion information for testing on at least part of one or more surrounding objects for testing located closer than the threshold from the subject vehicle for testing, in a test subject time range for verification from a 1'-st timing to a T'-th timing, and (ii) at least one trajectory vector for testing, corresponding to at least one piece of test route information for verification on at least one driving route for testing driven by the subject vehicle for testing in the test subject time range for verification, have been acquired, performing processes of (i) instructing the status network to apply said at least one first neural network operation to the status vector for testing, to thereby generate at least one status feature map for testing and (ii) instructing the trajectory network to apply said at least one second neural network operation to the trajectory vector for testing, to thereby generate at least one trajectory feature map for testing;
* (b) the testing device, if at least one concatenated feature map for testing corresponding to the status feature map for testing and the trajectory feature map for testing has been acquired, instructing the safety network to apply said at least one third neural network operation to the concatenated feature map for testing, to thereby calculate at least one predicted collision probability for testing representing a predicted probability for testing of an occurrence of at least one accident for testing caused by the driving route for testing indicated by the test route information for verification with regard to a circumstance for testing indicated by the test circumstance information for verification.
| 10. The method of Claim 9, wherein, at the step of (a), the testing device communicates with at least one basement server for testing interworking with the subject vehicle for testing, to perform processes of (i) generating the status vector for testing by using the test circumstance information for verification, including (i-1) at least one piece of subject location information for testing of the subject vehicle for testing, (i-2) at least one piece of subject velocity information for testing thereof, (i-3) at least one piece of surrounding location information for testing of at least part of surrounding vehicles for testing among the surrounding objects for testing and (i-4) at least one piece of surrounding velocity information for testing thereof, corresponding to the test subject time range for verification, which have been acquired from the basement server for testing, and (ii) generating the trajectory vector for testing by referring to the test route information for verification, corresponding to the driving route for testing of the subject vehicle for testing on a test region map for verification during the test subject time range for verification, which has been acquired from the basement server for testing.
| 11. The method of Claim 9 or 10, wherein, at the step of (a), the testing device performs processes of (i) generating the status vector for testing by using the test circumstance information for verification corresponding to the test subject time range for verification, including (i-1) at least one piece of subject location information for testing of the subject vehicle for testing and (i-2) at least one piece of subject velocity information for testing thereof, which have been acquired from at least one of a GPS for testing and a velocity control unit for testing included in the subject vehicle for testing, and (i-3) at least one piece of surrounding location information for testing of at least part of surrounding vehicles for testing among the surrounding objects for testing and (i-4) at least one piece of surrounding velocity information for testing thereof, which have been acquired by referring to at least one driving video for testing recorded through at least one subject camera for testing on the subject vehicle for testing during the test subject time range for verification, and (ii) generating the trajectory vector for testing by referring to the test route information for verification, corresponding to the driving route for testing of the subject vehicle for testing on a test region map for verification during the test subject time range for verification, which has been acquired from a planning unit for testing included in the subject vehicle for testing.
| 12. The method of Claim 9, 10 or 11, wherein, at the step of (a), the testing device performs processes of (i) generating the status vector for testing by using the test circumstance information for verification corresponding to the test subject time range for verification, including (i-1) at least one piece of subject location information for testing of the subject vehicle for testing and (i-2) at least one piece of subject velocity information for testing thereof, which have been acquired from at least one of a GPS for testing and a velocity control unit for testing included in the subject vehicle for testing, and (i-3) at least one piece of surrounding location information for testing of at least part of surrounding vehicles for testing among the surrounding objects for testing and (i-4) at least one piece of surrounding velocity information for testing thereof, which have been acquired by using a V2X communication module for testing included in the subject vehicle of testing during the test subject time range for verification, and (ii) generating the trajectory vector for testing by referring to the test route information for verification, corresponding to the driving route for testing of the subject vehicle for testing on a test region map for verification during the test subject time range for verification, which has been acquired from a planning unit for testing included in the subject vehicle for testing.
| 13. The method of any one of Claims 9 to 12, further comprising a step of:
(c) the testing device, if the predicted collision probability for testing is larger than a threshold and a driving mode of the subject vehicle for testing corresponds to a manual driving mode, instructing the subject vehicle for testing to switch its driving mode to an autonomous driving mode.
| 14. A learning device for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver with regard to a circumstance of its corresponding time range, comprising:
* at least one memory (115) that stores instructions; and
* at least one processor (120) configured to execute the instructions to perform processes of: (I) on condition that (i) at least one status vector, corresponding to at least one piece of circumstance information for verification including at least part of (i-1) at least one piece of subject motion information on at least one subject vehicle and (i-2) one or more pieces of surrounding motion information on at least part of one or more surrounding objects located closer than a threshold from the subject vehicle, in a subject time range for verification from a first timing to a T-th timing, and (ii) at least one trajectory vector, corresponding to at least one piece of route information for verification on at least one driving route driven by the subject vehicle in the subject time range for verification, are acquired, performing processes of (i) instructing a status network (130) to apply at least one first neural network operation to the status vector, to thereby generate at least one status feature map and (ii) instructing a trajectory network (140) to apply at least one second neural network operation to the trajectory vector, to thereby generate at least one trajectory feature map; (II) if at least one concatenated feature map corresponding to the status feature map and the trajectory feature map is acquired, instructing a safety network (150) to apply at least one third neural network operation to the concatenated feature map, to thereby calculate at least one predicted collision probability representing a predicted probability of an occurrence of at least one accident caused by the driving route indicated by the route information for verification with regard to a circumstance indicated by the circumstance information for verification, wherein the predicted collision probability is used for determining whether it is appropriate or not to switch driving modes of the subject vehicle from a manual driving mode to an autonomous driving mode, wherein, the processor (120) is further configured to instruct (i) at least one concatenating layer to generate the concatenated feature map by concatenating the status feature map and the trajectory feature map, (ii) at least one third convolutional layer (152) of the safety network (150) to generate at least one (3-1)-st feature map by applying at least one third convolutional operation to the concatenated feature map, (iii) at least one third pooling layer (153) of the safety network (150) to generate at least one (3-2)-nd feature map by applying at least one third pooling operation to the (3-1)-st feature map, and (iv) at least one third Fully-Connected, FC, layer (154) to generate the predicted collision probability by applying at least one third FC operation to the (3-2)-nd feature map; and (III) instructing a loss layer (160) to generate at least one loss by referring to the predicted collision probability and at least one Ground-Truth, GT, collision probability, which have been acquired beforehand, and to perform backpropagation by using the loss, to thereby learn at least part of parameters of the safety network, the trajectory network and the status network. | The learning method for calculating collision probability and switch manual mode to autonomous mode in vehicle involves Instructing (S01) status network to apply first neural network operation to status vector to generate status feature map. The trajectory network is instructed (S02) to apply second neural network operation to trajectory vector and generate trajectory feature map. 
The safety network is instructed (S03) to calculate the predicted collision probability by applying the third neural network operation to the concatenated feature map. The loss layer is instructed (S04) to generate a loss and perform backpropagation by using the loss. The learning device instructs the loss layer to generate the loss by referring to the predicted collision probability and performs backpropagation to learn parameters of the safety, trajectory and status networks. An INDEPENDENT CLAIM is included for: a learning device for calculating collision probability. Learning method in an autonomous vehicle for automatic switching of driving modes based on behavior analysis and driving style of the user on a recent driving route using a vehicle-to-everything (V2X) communication network. The method calculates collision probability by analyzing a recent driving route of a driver, to determine whether the driver is driving dangerously and switch the manual mode to the autonomous driving mode. The learning device instructs the loss layer to generate at least one loss by referring to the predicted collision probability and performs backpropagation to learn at least part of the parameters of the safety network. Increases the safety of the driver and reduces the collision dangers to surrounding vehicles. The drawing shows a flow-chart of a method for calculating collision probability, to be used for determining whether it is appropriate to switch driving modes to autonomous by analyzing the recent driving route of the driver. S01Instructing status network to generate status feature mapS02Instructing trajectory network to apply second trajectory feature mapS03Instructing safety network to calculate predicted collision probability from concatenated feature mapS04Instructing loss layer to generate loss and perform backpropagation
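The status/trajectory/safety pipeline of claims 1 to 3 above (a convolutional, pooling and fully-connected layer per branch, a concatenating layer, then a third conv/pool/FC stack producing the collision probability, trained by backpropagating a loss against a ground-truth collision probability) can be sketched as a minimal PyTorch module. This is a hedged sketch only: the 1-D convolutions, layer sizes, vector lengths and the binary cross-entropy loss are assumptions, not values stated in the patent.

```python
import torch
import torch.nn as nn

class BranchNet(nn.Module):
    """One branch (status or trajectory network): conv -> pool -> FC, as in claims 2 and 3."""
    def __init__(self, in_len, feat_dim=64):
        super().__init__()
        self.conv = nn.Conv1d(1, 8, kernel_size=3, padding=1)    # first/second convolutional layer
        self.pool = nn.MaxPool1d(2)                               # first/second pooling layer
        self.fc = nn.Linear(8 * (in_len // 2), feat_dim)          # first/second FC layer

    def forward(self, x):                                         # x: (batch, in_len)
        h = self.pool(torch.relu(self.conv(x.unsqueeze(1))))
        return self.fc(h.flatten(1))                              # feature map as a flat vector

class SafetyNet(nn.Module):
    """Safety network: concatenate both feature maps, then conv -> pool -> FC -> probability."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.conv = nn.Conv1d(1, 8, kernel_size=3, padding=1)     # third convolutional layer
        self.pool = nn.MaxPool1d(2)                                # third pooling layer
        self.fc = nn.Linear(8 * (2 * feat_dim // 2), 1)            # third FC layer

    def forward(self, status_feat, traj_feat):
        fused = torch.cat([status_feat, traj_feat], dim=1)         # concatenating layer
        h = self.pool(torch.relu(self.conv(fused.unsqueeze(1))))
        return torch.sigmoid(self.fc(h.flatten(1)))                # predicted collision probability

# One training step: loss against a ground-truth collision probability, then backpropagation.
status_net, traj_net, safety_net = BranchNet(40), BranchNet(20), SafetyNet()
params = list(status_net.parameters()) + list(traj_net.parameters()) + list(safety_net.parameters())
opt = torch.optim.SGD(params, lr=1e-3)

status_vec = torch.randn(4, 40)                  # assumed: batch of 4, 40-dim status vectors
traj_vec = torch.randn(4, 20)                    # assumed: 20-dim trajectory vectors
gt_prob = torch.randint(0, 2, (4, 1)).float()    # 1 if an accident occurred in the attention range

pred = safety_net(status_net(status_vec), traj_net(traj_vec))
loss = nn.functional.binary_cross_entropy(pred, gt_prob)   # loss layer (BCE chosen as an assumption)
opt.zero_grad(); loss.backward(); opt.step()
print("predicted collision probabilities:", pred.detach().squeeze().tolist())
```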
Please summarize the input | METHOD AND DEVICE FOR INTER-VEHICLE COMMUNICATION VIA RADAR SYSTEM. A method for a V2V communication by using a radar module used for detecting objects nearby is provided. And the method includes steps of: (a) a computing device (100) performing (i) a process of instructing the radar module (130) to transmit 1-st transmitting signals by referring to at least one 1-st schedule and (ii) a process of generating RVA information by using (1-1)-st receiving signals, corresponding to the 1-st transmitting signals; and (b) the computing device performing a process of instructing the radar module to transmit 2-nd transmitting signals by referring to at least one 2-nd schedule.|1. A method for a V2V communication by using a radar module used for detecting objects nearby, comprising steps of: (a) a computing device, if a 1-st trigger that a 1-st timing corresponding to a current time is included in a 1-st time slot is detected, performing (i) a process of instructing the radar module to transmit one or more 1-st transmitting signals, to be used for acquiring RVA information on at least part of one or more head directions, one or more relative locations and one or more relative velocities of at least part of one or more 1-st surrounding objects including one or more 1-st surrounding vehicles located closer than a first threshold from a subject vehicle interworking with the computing device at the 1-st timing, by referring to at least one 1-st schedule and (ii) a process of generating the RVA information by using one or more (1-1)-st receiving signals, corresponding to the 1-st transmitting signals, acquired through the radar module; and (b) the computing device, if a 2-nd trigger that a 2-nd timing corresponding to a later time from the 1-st timing is included in a 2-nd time slot is detected, performing a process of instructing the radar module to transmit one or more 2-nd transmitting signals by referring to at least one 2-nd schedule, corresponding to V2V transmitting information on the subject vehicle.
| 2. The method of Claim 1, wherein, at the step of (a), the computing device, if one or more (1-2)-nd receiving signals are acquired through the radar module from at least part of the 1-st surrounding objects, further performs a process of generating 1-st V2V receiving information by referring to the (1-2)-nd receiving signals in parallel with the other processes performed during the step of (a), and wherein, at the step of (b), the computing device, if one or more 2-nd receiving signals are acquired through the radar module from at least part of 2-nd surrounding vehicles which are located closer than the first threshold from the subject vehicle at the 2-nd timing, performs a process of generating 2-nd V2V receiving information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd receiving signals in parallel with the other process performed during the step of (b).
| 3. The method of Claim 2, wherein, at the step of (a), the computing device (i) detects at least one starting signal and at least one ending signal among the (1-2)-nd receiving signals by referring to a reference length included in a communication rule of the V2V communication, (ii) generates at least one meta data permutation including at least part of the (1-2)-nd receiving signals between the starting signal and the ending signal, and then (iii) generates the 1-st V2V receiving information by referring to the meta data permutation.
| 4. The method of Claim 3, wherein, at the step of (a), the computing device, if a time gap between receiving timings of its two inputted receiving signals is smaller than a second threshold and thus if the inputted receiving signals are not determined as being included in the (1-1)-st receiving signals or as being included in the (1-2)-nd receiving signals, instructs the radar module to continuously receive receiving signals until the meta data permutation is generated.
| 5. The method of Claim 3, wherein, at the step of (a), the computing device compares the meta data permutation and each of one or more reference data permutations corresponding to each of driving circumstances, to find a specific reference data permutation whose similarity score with the meta data permutation is larger than a third threshold, to thereby generate the 1-st V2V receiving information by referring to information on a specific driving circumstance corresponding to the specific reference data permutation.
| 6. The method of Claim 2, wherein, at the step of (a), the computing device generates 1-st circumstance information on at least part of the 1-st surrounding vehicles by referring to the 1-st V2V receiving information, to thereby support an autonomous driving of the subject vehicle by referring to the 1-st circumstance information, and wherein, at the step of (b), the computing device generates 2-nd circumstance information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd V2V receiving information, to thereby support the autonomous driving of the subject vehicle by referring to the 2-nd circumstance information, wherein (2-1)-st circumstance information among the 2-nd circumstance information is acquired by updating at least part of the 1-st circumstance information on one or more specific vehicles included in both of the 1-st surrounding vehicles and the 2-nd surrounding vehicles, using at least part of the 2-nd V2V receiving information thereon, and wherein (2-2)-nd circumstance information on another vehicles among the 2-nd surrounding vehicles other than the specific vehicles is acquired by another part of the 2-nd V2V receiving information.
| 7. The method of Claim 2, wherein, at the step of (b), the computing device transmits the 2-nd transmitting signals with its transmitting timings determined by referring to the 2-nd schedule to thereby deliver the V2V transmitting information to at least part of the 2-nd surrounding vehicles, wherein the 2-nd schedule has been acquired by referring to a specific reference data permutation corresponding to the V2V transmitting information among each of one or more reference data permutations for each of driving circumstances and a reference length included in a communication rule of the V2V communication.
| 8. The method of Claim 2, wherein, at the step of (a), the computing device, if a time gap between receiving timings of its two inputted receiving signals is larger than or same as a second threshold, (i) generates Intermediate Frequency(IF) signals between one of the 1-st transmitting signals and said inputted receiving signals, (ii) determines whether each of center frequencies of each of the IF signals is included in a 1-st frequency range or is included in a 2-nd frequency range, to thereby determine whether each of said inputted receiving signals is included in the (1-1)-st receiving signals or the (1-2)-nd receiving signals.
| 9. The method of Claim 2, wherein the computing device uses a frequency interference prevention filter to acquire the (1-1)-st receiving signals and the (1-2)-nd receiving signals at a timing included in the 1-st time slot, and wherein the computing device does not use the frequency interference prevention filter to acquire the 2-nd receiving signals at a timing included in the 2-nd time slot.
| 10. The method of Claim 1, wherein, at the step of (a), as the computing device instructs the radar module to transmit the 1-st transmitting signals built as chirp signals, the computing device (i) acquires the (1-1)-st receiving signals, which are reflected signals of the 1-st transmitting signals, through the radar module, (ii) generates each of one or more Intermediate Frequency(IF) signals between each of the 1-st transmitting signals and each of the (1-1)-st receiving signals, and (iii) generates the RVA information by applying a Fourier transform to the IF signals.
| 11. A computing device for a V2V communication by using a radar module used for detecting objects nearby, comprising: at least one memory that stores instructions; and at least one processor configured to execute the instructions to perform processes of: (I) if a 1-st trigger that a 1-st timing corresponding to a current time is included in a 1-st time slot is detected, (i) instructing the radar module to transmit one or more 1-st transmitting signals, to be used for acquiring RVA information on at least part of one or more head directions, one or more relative locations and one or more relative velocities of at least part of one or more 1-st surrounding objects including one or more 1-st surrounding vehicles located closer than a first threshold from a subject vehicle interworking with the computing device at the 1-st timing, by referring to at least one 1-st schedule and (ii) generating the RVA information by using one or more (1-1)-st receiving signals, corresponding to the 1-st transmitting signals, acquired through the radar module; and (II) if a 2-nd trigger that a 2-nd timing corresponding to a later time from the 1-st timing is included in a 2-nd time slot is detected, instructing the radar module to transmit one or more 2-nd transmitting signals by referring to at least one 2-nd schedule, corresponding to V2V transmitting information on the subject vehicle.
| 12. The computing device of Claim 11, wherein, at the process of (I), the processor, if one or more (1-2)-nd receiving signals are acquired through the radar module from at least part of the 1-st surrounding objects, further performs a process of generating 1-st V2V receiving information by referring to the (1-2)-nd receiving signals in parallel with the process of (I), and wherein, at the process of (II), the processor, if one or more 2-nd receiving signals are acquired through the radar module from at least part of 2-nd surrounding vehicles which are located closer than the first threshold from the subject vehicle at the 2-nd timing, performs a process of generating 2-nd V2V receiving information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd receiving signals in parallel with the process of (II).
| 13. The computing device of Claim 12, wherein, at the process of (I), the processor (i) detects at least one starting signal and at least one ending signal among the (1-2)-nd receiving signals by referring to a reference length included in a communication rule of the V2V communication, (ii) generates at least one meta data permutation including at least part of the (1-2)-nd receiving signals between the starting signal and the ending signal, and then (iii) generates the 1-st V2V receiving information by referring to the meta data permutation.
| 14. The computing device of Claim 13, wherein, at the process of (I), the processor, if a time gap between receiving timings of its two inputted receiving signals is smaller than a second threshold so that the inputted receiving signals cannot be distinguished as being included in the (1-1)-st receiving signals or the (1-2)-nd receiving signals, instructs the radar module to continuously receive receiving signals until the meta data permutation is generated.
| 15. The computing device of Claim 13, wherein, at the process of (I), the processor compares the meta data permutation and each of one or more reference data permutations corresponding to each of driving circumstances, to find a specific reference data permutation whose similarity score with the meta data permutation is larger than a third threshold, to thereby generate the 1-st V2V receiving information by referring to information on a specific driving circumstance corresponding to the specific reference data permutation.
| 16. The computing device of Claim 12, wherein, at the process of (I), the processor generates 1-st circumstance information on at least part of the 1-st surrounding vehicles by referring to the 1-st V2V receiving information, to thereby support an autonomous driving of the subject vehicle by referring to the 1-st circumstance information, and wherein, at the process of (II), the processor generates 2-nd circumstance information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd V2V receiving information, to thereby support the autonomous driving of the subject vehicle by referring to the 2-nd circumstance information, wherein (2-1)-st circumstance information among the 2-nd circumstance information is acquired by updating at least part of the 1-st circumstance information on one or more specific vehicles included in both of the 1-st surrounding vehicles and the 2-nd surrounding vehicles, using at least part of the 2-nd V2V receiving information thereon, and wherein (2-2)-nd circumstance information on another vehicles among the 2-nd surrounding vehicles other than the specific vehicles is acquired by another part of the 2-nd V2V receiving information.
| 17. The computing device of Claim 12, wherein, at the process of (II), the processor transmits the 2-nd transmitting signals with its transmitting timings determined by referring to the 2-nd schedule to thereby deliver the V2V transmitting information to at least part of the 2-nd surrounding vehicles, wherein the 2-nd schedule has been acquired by referring to a specific reference data permutation corresponding to the V2V transmitting information among each of one or more reference data permutations for each of driving circumstances and a reference length included in a communication rule of the V2V communication.
| 18. The computing device of Claim 12, wherein, at the process of (I), the processor, if a time gap between receiving timings of its two inputted receiving signals is larger than or same as a second threshold, (i) generates Intermediate Frequency(IF) signals between one of the 1-st transmitting signals and said inputted receiving signals, (ii) determines whether each of center frequencies of each of the IF signals is included in a 1-st frequency range or is included in a 2-nd frequency range, to thereby determine whether each of said inputted receiving signals is included in the (1-1)-st receiving signals or the (1-2)-nd receiving signals.
| 19. The computing device of Claim 12, wherein the processor uses a frequency interference prevention filter to acquire the (1-1)-st receiving signals and the (1-2)-nd receiving signals at a timing included in the 1-st time slot, and wherein the processor does not use the frequency interference prevention filter to acquire the 2-nd receiving signals at a timing included in the 2-nd time slot.
| 20. The computing device of Claim 11, wherein, at the process of (I), as the processor instructs the radar module to transmit the 1-st transmitting signals built as chirp signals, the processor (i) acquires the (1-1)-st receiving signals, which are reflected signals of the 1-st transmitting signals, through the radar module, (ii) generates each of one or more Intermediate Frequency(IF) signals between each of the 1-st transmitting signals and each of the (1-1)-st receiving signals, and (iii) generates the RVA information by applying a Fourier transform to the IF signals. | The method involves performing a process of instructing (S01-1) the radar module to transmit 1-st transmitting signals to be used for acquiring RVA information on head directions, relative locations and relative velocities of 1-st surrounding objects, including 1-st surrounding vehicles which are located closer than a first threshold from a subject vehicle interworking with the computing device at the 1-st timing, by referring to a 1-st schedule, if a 1-st trigger that a 1-st timing corresponding to a current time is included in a 1-st time slot is detected. A process of generating (S01-2) the RVA information is performed by using (1-1)-st receiving signals, corresponding to the 1-st transmitting signals, which are acquired through the radar module. A process of instructing (S02-1) the radar module to transmit 2-nd transmitting signals is performed by referring to a 2-nd schedule, corresponding to V2V transmitting information on the subject vehicle, if a 2-nd trigger that a 2-nd timing corresponding to a later time from the 1-st timing is included in a 2-nd time slot is detected. An INDEPENDENT CLAIM is included for a computing device for a V2V communication. Method for vehicle-to-vehicle (V2V) communication. The radar module is allowed to perform the V2V communication while performing its original functions by transmitting different signals according to time slots. The drawing shows a flowchart illustrating the method for the V2V communication by using the radar module used for detecting objects nearby. S01-1Step for instructing the radar module to transmit 1-st transmitting signalsS01-2Step for generating the RVA informationS01-3Step for generating 1-st V2V receiving informationS02-1Step for instructing the radar module to transmit 2-nd transmitting signalsS02-2Step for generating 2-nd V2V receiving information
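Claims 10 and 20 above generate Intermediate Frequency (IF) signals by mixing each transmitted chirp with its received echo and recover the RVA information with a Fourier transform. The NumPy sketch below simulates that step for a single reflecting target; the chirp parameters and target range are assumed values, and only the range part of the RVA information is recovered here.

```python
import numpy as np

# Assumed FMCW chirp parameters (not taken from the patent)
C = 3e8                 # speed of light, m/s
F0 = 77e9               # carrier frequency, Hz
BW = 300e6              # chirp bandwidth, Hz
T_CHIRP = 50e-6         # chirp duration, s
FS = 20e6               # ADC sampling rate, Hz
SLOPE = BW / T_CHIRP    # chirp slope, Hz/s

true_range = 42.0                       # target range in meters (to be recovered)
tau = 2 * true_range / C                # round-trip delay of the echo

t = np.arange(0, T_CHIRP, 1 / FS)
tx_phase = 2 * np.pi * (F0 * t + 0.5 * SLOPE * t ** 2)                   # 1-st transmitting signal (chirp)
rx_phase = 2 * np.pi * (F0 * (t - tau) + 0.5 * SLOPE * (t - tau) ** 2)   # (1-1)-st receiving signal (echo)

# IF signal: mix the transmitted chirp with the conjugate of the received echo.
if_signal = np.exp(1j * tx_phase) * np.conj(np.exp(1j * rx_phase))

# Fourier transform of the IF signal; the beat-frequency peak encodes the range.
spectrum = np.abs(np.fft.fft(if_signal))
freqs = np.fft.fftfreq(len(if_signal), d=1 / FS)
beat = abs(freqs[np.argmax(spectrum[: len(spectrum) // 2])])             # positive-frequency peak

estimated_range = beat * C / (2 * SLOPE)
print(f"beat frequency: {beat / 1e3:.1f} kHz, estimated range: {estimated_range:.1f} m")
```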
Please summarize the input | METHOD AND DEVICE FOR SIGNALING PRESENT DRIVING INTENTION OF AUTONOMOUS VEHICLE TO HUMANS BY USING VARIOUS V2X-ENABLED APPLICATIONS | The present invention provides a method for signaling a driving intention of an autonomous vehicle, wherein a driving intention signaling device (a) detects a pedestrian located in front of the autonomous vehicle using a surrounding video image (Surrounding Video Image) and determines whether the pedestrian crosses the road using a virtual crosswalk; (b) if it is determined that the pedestrian is crossing the road, predicts a crossing trajectory corresponding to the expected path of the pedestrian by referring to the moving trajectory of the specific pedestrian, sets a driving plan for the autonomous vehicle by referring to the driving information and the crossing trajectory, and allows the autonomous vehicle to drive autonomously according to the driving plan; and (c) determines whether the specific pedestrian is paying attention to the autonomous vehicle by referring to the gaze pattern, and, if not, transmits the driving intention to the pedestrian and surrounding drivers through an external display and an external speaker.|1. A method of signaling at least one driving intention of an autonomous vehicle, comprising: (a) performing, by a driving intention signaling device (Driving Intention Signaling Device), a process of detecting at least one pedestrian located in a nearby front area of the autonomous vehicle using at least one surrounding video image (Surrounding Video Image) of the autonomous vehicle, and a process of determining, by using a virtual crosswalk corresponding to one of the positions of the pedestrians, whether a specific pedestrian among the pedestrians crosses a road on which the autonomous vehicle travels;
(b) when it is determined that the specific pedestrian crosses the road, performing, by the driving intention signaling device, a process of predicting at least one crosswalking trajectory corresponding to at least one expected path along which the specific pedestrian attempts to cross the road, by referring to at least one moving trajectory of the specific pedestrian, a process of setting at least one driving plan of the autonomous vehicle by referring to driving information of the autonomous vehicle and the crosswalking trajectory, and a process of allowing the autonomous vehicle to drive according to the driving plan; and (c) performing, by the driving intention signaling device, a process of determining whether the specific pedestrian is paying attention to the autonomous vehicle by referring to at least one gaze pattern of the specific pedestrian using the surrounding video image, and, when it is determined that the specific pedestrian is not paying attention to the autonomous vehicle, a process of delivering the driving intention of the autonomous vehicle corresponding to the driving plan to at least one of the specific pedestrian and at least one driver of at least one surrounding vehicle through at least one of an external display and an external speaker installed on the autonomous vehicle;
wherein, in step (a), the driving intention signaling device performs (i) a process of creating the virtual crosswalk, corresponding to the road width of the road, so as to include first areas each extending a first distance to both sides from each boundary line between the road and each sidewalk in each sidewalk direction, and second areas each being an additional area selected with reference to a predetermined point on each first area, (ii) a process of making the longitudinal central axis of the virtual crosswalk correspond to the location of the specific pedestrian, and (iii) a process of determining that the specific pedestrian is crossing the road when the specific pedestrian is located in one of the first areas, and determining that the specific pedestrian intends to cross the road when the specific pedestrian is located in one of the second areas.
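For illustration only, the snippet below models one possible reading of the virtual-crosswalk geometry above: first areas straddle each road/sidewalk boundary line, second areas sit further out on the sidewalk, and the crosswalk's longitudinal axis follows the pedestrian. The coordinate convention and all dimensions are assumptions, not values taken from the claim.

```python
from dataclasses import dataclass

@dataclass
class VirtualCrosswalk:
    """Assumed coordinate frame: x is the lateral position across the road
    (x = 0 and x = road_width are the road/sidewalk boundary lines), y runs
    along the road. All dimensions are illustrative, not from the claim."""
    road_width: float           # m
    first_extent: float = 1.0   # first areas reach this far to both sides of each boundary line
    second_extent: float = 2.0  # second areas extend this much further onto the sidewalk
    half_width: float = 1.5     # half of the crosswalk's extent along the road (y) axis
    center_y: float = 0.0       # longitudinal central axis follows the pedestrian's y position

    def classify(self, x: float, y: float) -> str:
        """Classify a pedestrian position relative to the virtual crosswalk."""
        if abs(y - self.center_y) > self.half_width:
            return "outside"
        dist_to_boundary = min(abs(x - 0.0), abs(x - self.road_width))
        on_road = 0.0 <= x <= self.road_width
        if on_road or dist_to_boundary <= self.first_extent:
            return "crossing"             # inside a first area or already on the roadway
        if dist_to_boundary <= self.first_extent + self.second_extent:
            return "intends_to_cross"     # inside a second area on the sidewalk
        return "outside"

# Example: a 7 m wide road, crosswalk axis at the pedestrian's y position.
crosswalk = VirtualCrosswalk(road_width=7.0, center_y=3.0)
print(crosswalk.classify(x=-0.5, y=3.2))   # -> crossing (within a first area)
print(crosswalk.classify(x=-2.5, y=3.2))   # -> intends_to_cross (second area)
```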
| 2. The method according to claim 1, wherein, in step (c), the driving intention signaling device performs a process of determining, with reference to the gaze pattern of the specific pedestrian, that the specific pedestrian is not paying attention to the autonomous vehicle (i) when a cumulative sum of each of at least one gaze time section (Gaze Time Section), meaning a time for which the gaze of the specific pedestrian is located on the autonomous vehicle during a preset reference time, is less than or equal to a preset first threshold, or (ii) when, after a state in which the gaze of the specific pedestrian is located on the autonomous vehicle and a state in which it is not have alternated, the time period during which the gaze of the specific pedestrian is not located on the autonomous vehicle is greater than or equal to a preset second threshold.
| 3. The method according to claim 2, wherein the driving intention signaling device performs a process of determining that the specific pedestrian is paying attention to the autonomous vehicle (iii) when the cumulative sum of each of the at least one gaze time section, meaning a time during which the gaze of the specific pedestrian is located on the autonomous vehicle during the preset reference time, exceeds the preset first threshold, or (iv) when, after the state in which the gaze of the specific pedestrian is located on the autonomous vehicle and the state in which it is not have alternated, the time during which the gaze of the specific pedestrian is located on the autonomous vehicle is equal to or greater than a third threshold.
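A hypothetical sketch of the attention test described in claims 2 and 3: accumulate the gaze-on-vehicle time over a reference window and inspect the most recent gaze-on or gaze-off run against thresholds. The sampling scheme, threshold names, and default values are assumptions.

```python
def trailing_run(samples):
    """Length (in samples) of the most recent run of identical values."""
    run, last = 0, samples[-1]
    for value in reversed(samples):
        if value != last:
            break
        run += 1
    return run, last

def is_paying_attention(gaze_on_vehicle, dt, t_min_cumulative, t_max_off, t_min_on):
    """gaze_on_vehicle: booleans sampled every dt seconds over the reference window.
    t_min_cumulative ~ first threshold, t_max_off ~ second threshold,
    t_min_on ~ third threshold (all in seconds; values are tuning assumptions)."""
    cumulative_on = sum(gaze_on_vehicle) * dt
    run, last_on = trailing_run(gaze_on_vehicle)
    trailing_time = run * dt

    if cumulative_on <= t_min_cumulative:
        return False                         # (i) too little accumulated gaze time
    if not last_on and trailing_time >= t_max_off:
        return False                         # (ii) has been looking away for too long
    if last_on and trailing_time >= t_min_on:
        return True                          # (iv) has been watching long enough
    return cumulative_on > t_min_cumulative  # (iii) enough accumulated gaze time

# Example: 0.1 s sampling; 1.0 s looking, 0.3 s away, then 1.2 s looking again.
samples = [True] * 10 + [False] * 3 + [True] * 12
print(is_paying_attention(samples, 0.1, t_min_cumulative=1.5, t_max_off=1.0, t_min_on=1.0))  # True
```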
| 4. delete
| 5. The method of claim 1, wherein, in step (b), the driving intention signaling device, with reference to the movement speed and acceleration of the specific pedestrian, performs a process of predicting the crosswalking trajectory of the specific pedestrian by (i) using a constant acceleration model with a limited maximum speed in a primary spatial section (Primary Spatial Section) located a second distance or more from the crossing end point in the direction of the road, and (ii) using the constant acceleration model according to a preset negative acceleration, while maintaining the direction of movement of the specific pedestrian, in a second spatial section located less than the second distance from the crossing end point in the direction of the road.
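The trajectory prediction in claim 5 could be sketched as below, assuming a one-dimensional crossing coordinate: a constant acceleration model capped at a maximum speed is used while the pedestrian is at least `second_distance` from the crossing end point, and a preset negative acceleration is applied inside that distance. All numeric defaults are illustrative assumptions.

```python
def predict_crossing_trajectory(pos, speed, accel, end_pos, second_distance,
                                v_max=2.5, decel=-0.8, dt=0.1, horizon=6.0):
    """Predict 1-D crossing positions (metres from the start kerb) over `horizon`
    seconds. Beyond `second_distance` from the crossing end point a constant
    acceleration model capped at v_max is used; inside it a preset negative
    acceleration is applied while the walking direction is kept."""
    trajectory, t = [], 0.0
    while t < horizon:
        remaining = end_pos - pos
        a = accel if remaining >= second_distance else decel
        speed = max(0.0, min(v_max, speed + a * dt))   # constant-acceleration step, speed-limited
        pos = min(end_pos, pos + speed * dt)
        t += dt
        trajectory.append((round(t, 2), round(pos, 2)))
    return trajectory

# Example: pedestrian 7 m from the far kerb, walking at 1.2 m/s with slight acceleration.
print(predict_crossing_trajectory(pos=0.0, speed=1.2, accel=0.3,
                                  end_pos=7.0, second_distance=2.0)[-1])
```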
| 6. The method of claim 1, wherein the driving intention signaling device performs a process of setting the driving plan by predicting a driving route, acceleration information, deceleration information, and steering information so that the autonomous vehicle avoids the crosswalking trajectory of the specific pedestrian, or so that the specific pedestrian does not feel threatened and traffic laws are not violated.
| 7. The method of claim 6, wherein, in a state in which a first action plan to a j-th action plan - where j is an integer greater than or equal to 1 - corresponding to at least a part of an acceleration operation, a deceleration operation, and a steering operation of the autonomous vehicle are set, and n steps - where n is an integer greater than or equal to 1 - are set at a preset time interval, the driving intention signaling device repeats, for a k-th step - where k is an integer greater than or equal to 1 and less than or equal to n - a process of predicting, as a k-th optimal action plan, a specific action plan that minimizes a pedestrian injury cost, a pedestrian threat cost, a law violation cost, and a ride comfort cost among the results of performing each of the first action plan to the j-th action plan based on (k-1)-th driving information of the autonomous vehicle corresponding to a (k-1)-th optimal action plan predicted in the (k-1)-th step, thereby performing a process of setting the driving plan of the autonomous vehicle with reference to each of a first optimal action plan to an n-th optimal action plan selected from the first action plan to the j-th action plan in each of the n steps.
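One way to read the step-wise planning of claim 7 is the greedy loop sketched below: at each of n steps, every candidate action plan is rolled forward from the state produced by the previous step's optimal plan, and the plan with the smallest summed cost (injury, threat, law violation, ride comfort) is kept. The `simulate` hook and the cost functions are assumed to be supplied elsewhere; the claim does not define them.

```python
def plan_driving(initial_state, action_plans, simulate, costs, n_steps):
    """Greedy step-wise planner sketch. `simulate(state, plan)` rolls the vehicle
    state forward by one preset time interval under a candidate plan, and `costs`
    is a list of functions (injury, threat, law violation, ride comfort) mapping
    a state to a non-negative cost; both are assumed to be provided by the caller."""
    state, chosen = initial_state, []
    for _ in range(n_steps):
        best_plan, best_state, best_cost = None, None, float("inf")
        for plan in action_plans:
            candidate = simulate(state, plan)
            total = sum(cost(candidate) for cost in costs)
            if total < best_cost:
                best_plan, best_state, best_cost = plan, candidate, total
        chosen.append(best_plan)   # the k-th optimal action plan
        state = best_state         # becomes the k-th driving information for the next step
    return chosen
```

The greedy, per-step choice mirrors the claim's chaining of the (k-1)-th optimal plan into the k-th prediction; a fuller planner might instead search over entire n-step sequences.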
| 8. The method of claim 1, wherein, in step (c), the driving intention signaling device performs a process of displaying at least a portion of a color, a symbol, a text, and an emoji corresponding to the driving intention through the external display.
| 9. The method of claim 1, wherein, in step (c), the driving intention signaling device performs a process of determining whether at least one occupant of the autonomous vehicle is paying attention to the nearby front area of the autonomous vehicle or to the specific pedestrian, by referring to at least one gaze pattern of the occupant using at least one interior image of the autonomous vehicle, and, when it is determined that the occupant is not paying attention to the nearby front area of the autonomous vehicle or to the specific pedestrian, a process of delivering the driving intention of the autonomous vehicle corresponding to the driving plan to the occupant through at least one of an internal display and an internal speaker installed in the autonomous vehicle.
| 10. A driving intention signaling device for signaling at least one driving intention of an autonomous vehicle, comprising: at least one memory for storing instructions; and at least one processor configured to execute the instructions for performing: (I) a process of detecting at least one pedestrian located in a nearby front area of the autonomous vehicle using at least one surrounding video image (Surrounding Video Image) of the autonomous vehicle, and a process of determining whether a specific pedestrian among the pedestrians crosses a road on which the autonomous vehicle travels by using a virtual crosswalk corresponding to one of the locations of the pedestrians; (II) when it is determined that the specific pedestrian is crossing the road, a process of predicting at least one crosswalking trajectory corresponding to at least one expected path along which the specific pedestrian intends to cross the road with reference to at least one moving trajectory of the specific pedestrian, a process of setting at least one driving plan of the autonomous vehicle with reference to driving information of the autonomous vehicle and the crosswalking trajectory, and a process of allowing the autonomous vehicle to drive according to the driving plan; and (III) a process of determining whether the specific pedestrian is paying attention to the autonomous vehicle by referring to at least one gaze pattern of the specific pedestrian using the surrounding video image, and, when it is determined that the specific pedestrian is not paying attention to the autonomous vehicle, a process of delivering the driving intention of the autonomous vehicle corresponding to the driving plan to at least one of the specific pedestrian and at least one driver of at least one surrounding vehicle through at least one of an external display and an external speaker installed on the autonomous vehicle;
wherein, in the process (I), the processor performs (i) a process of creating the virtual crosswalk, corresponding to the road width of the road, so as to include first areas each extending a first distance to both sides from each boundary line between the road and each sidewalk in each sidewalk direction, and second areas each being an additional area selected with reference to a predetermined point on each first area, (ii) a process of making the longitudinal central axis of the virtual crosswalk correspond to the location of the specific pedestrian, and (iii) a process of determining that the specific pedestrian is crossing the road when the specific pedestrian is located in one of the first areas, and determining that the specific pedestrian intends to cross the road when the specific pedestrian is located in one of the second areas.
| 11. The device of claim 10, wherein, in the process (III), the processor performs a process of determining, with reference to the gaze pattern of the specific pedestrian, that the specific pedestrian is not paying attention to the autonomous vehicle (i) when a cumulative sum of each of at least one gaze time section (Gaze Time Section), meaning a time for which the gaze of the specific pedestrian is located on the autonomous vehicle during a preset reference time, is less than or equal to a preset first threshold, or (ii) when, after a state in which the gaze of the specific pedestrian is located on the autonomous vehicle and a state in which it is not have alternated, the time period during which the gaze of the specific pedestrian is not located on the autonomous vehicle is equal to or greater than a preset second threshold.
| 12. The device of claim 11, wherein the processor performs a process of determining that the specific pedestrian is paying attention to the autonomous vehicle (iii) when the cumulative sum of each of the at least one gaze time section, meaning a time during which the gaze of the specific pedestrian is located on the autonomous vehicle during the preset reference time, exceeds the preset first threshold, or (iv) when, after the state in which the gaze of the specific pedestrian is located on the autonomous vehicle and the state in which it is not have alternated, the time during which the gaze of the specific pedestrian is located on the autonomous vehicle is equal to or greater than a third threshold.
| 13. delete
| 14. The device of claim 10, wherein, in the process (II), the processor, with reference to the movement speed and acceleration of the specific pedestrian, performs a process of predicting the crosswalking trajectory of the specific pedestrian by (i) using a constant acceleration model with a limited maximum speed in a primary spatial section (Primary Spatial Section) located a second distance or more from the crossing end point in the direction of the road, and (ii) using the constant acceleration model according to a preset negative acceleration, while maintaining the direction of movement of the specific pedestrian, in a second spatial section located less than the second distance from the crossing end point in the direction of the road.
| 15. The device of claim 10, wherein the processor performs a process of setting the driving plan by predicting a driving route, acceleration information, deceleration information, and steering information so that the autonomous vehicle avoids the crosswalking trajectory of the specific pedestrian, or so that the specific pedestrian does not feel threatened and traffic laws are not violated.
| 16. The device of claim 15, wherein, in a state in which a first action plan to a j-th action plan - where j is an integer greater than or equal to 1 - corresponding to at least a part of an acceleration operation, a deceleration operation, and a steering operation of the autonomous vehicle are set, and n steps - where n is an integer greater than or equal to 1 - are set at a preset time interval, the processor repeats, for a k-th step, a process of predicting, as a k-th optimal action plan, a specific action plan that minimizes a pedestrian injury cost, a pedestrian threat cost, a law violation cost, and a ride comfort cost among the results of performing each of the first action plan to the j-th action plan based on (k-1)-th driving information of the autonomous vehicle corresponding to a (k-1)-th optimal action plan predicted in the (k-1)-th step, thereby performing a process of setting the driving plan of the autonomous vehicle with reference to each of a first optimal action plan to an n-th optimal action plan selected from the first action plan to the j-th action plan in each of the n steps.
| 17. The device of claim 10, wherein, in the process (III), the processor performs a process of displaying at least a part of a color, a symbol, a text, and an emoji corresponding to the driving intention through the external display.
| 18. The device of claim 10, wherein, in the process (III), the processor performs a process of determining whether at least one occupant of the autonomous vehicle is paying attention to the nearby front area of the autonomous vehicle or to the specific pedestrian, by referring to at least one gaze pattern of the occupant using at least one interior image of the autonomous vehicle, and, when it is determined that the occupant is not paying attention to the nearby front area of the autonomous vehicle or to the specific pedestrian, a process of delivering the driving intention of the autonomous vehicle corresponding to the driving plan to the occupant through at least one of an internal display and an internal speaker installed in the autonomous vehicle. | The method involves detecting multiple pedestrians located in a nearby front area of the autonomous vehicle by using a surroundings video image of the autonomous vehicle. Whether a specific pedestrian among the pedestrians crosses a roadway where the autonomous vehicle is traveling is determined by using a virtual crosswalk corresponding to one of the locations of the pedestrians. The crosswalking trajectory corresponding to the expected path by which the specific pedestrian is to cross the roadway is estimated. Whether the specific pedestrian is paying attention to the autonomous vehicle is determined by referring to multiple gaze patterns of the specific pedestrian using the surroundings video image. An INDEPENDENT CLAIM is included for a driving intention signaling device for signaling at least one driving intention of an autonomous vehicle. Method for signaling a driving intention of an autonomous vehicle. The method allows the pedestrian or the passenger to recognize a current driving intention of the autonomous vehicle by acquiring optimal information to be used for signaling the current driving intention of the autonomous vehicle and allowing the pedestrian, the passenger, and the drivers of other vehicles to calm down, figure out and expect behaviors of the autonomous vehicle. The drawing shows a block representation of a driving intention signaling device. 100 Driving intention signaling device, 110 Memory, 120 Processor
Please summarize the input | Autonomous vehicle technology effectiveness determination for insurance pricingMethods and systems for determining the effectiveness of one or more autonomous (and/or semi-autonomous) operation features of a vehicle are provided. According to certain aspects, information regarding autonomous operation features of the vehicle may be used to determine an effectiveness metric indicative of the ability of each autonomous operation feature to avoid or mitigate accidents or other losses. The information may include operating data from the vehicle or other vehicles having similar autonomous operation features, test data, or loss data from other vehicles. The determined effectiveness metric may then be used to determine part or all of an insurance policy, which may be reviewed by an insured and updated based upon the effectiveness metric.What is claimed is:
| 1. A computer-implemented method for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the method comprising:
implementing, by a test computing system, the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment;
presenting, by the test computing system, virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
in response to the virtual test sensor data, generating, by the test computing system, test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and
based upon the test responses, determining, by the test computing system, an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies.
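A toy sketch, not the patented implementation, of the combined-reliability idea in claim 1: every non-empty combination of technologies is exercised against each virtual test scenario, and the metric is the fraction of scenarios the combination handles. The `run_test` hook standing in for the virtual test environment, and the toy technologies and scenarios, are assumptions.

```python
from itertools import combinations

def effectiveness_metric(technologies, scenarios, run_test):
    """For every non-empty combination of technologies, record the fraction of
    virtual test scenarios in which the combination avoided or mitigated the
    simulated incident. `run_test(combo, scenario) -> bool` is an assumed hook
    into the virtual test environment."""
    metric = {}
    for r in range(1, len(technologies) + 1):
        for combo in combinations(technologies, r):
            passed = sum(1 for scenario in scenarios if run_test(combo, scenario))
            metric[combo] = passed / len(scenarios)
    return metric

# Toy stand-in for the virtual test environment: braking alone handles the
# rear-end scenario, braking plus steering handles both scenarios.
def fake_run_test(combo, scenario):
    if scenario == "rear_end":
        return "auto_brake" in combo
    return {"auto_brake", "auto_steer"} <= set(combo)

print(effectiveness_metric(["auto_brake", "auto_steer"],
                           ["rear_end", "cut_in"], fake_run_test))
```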
| 2. The computer-implemented method of claim 1, wherein:
the plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and
determining the effectiveness metric includes determining an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 3. The computer-implemented method of claim 2, wherein determining the update to the effectiveness metric includes determining a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 4. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 5. The computer-implemented method of claim 1, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.
| 6. The computer-implemented method of claim 1, wherein:
determining the effectiveness metric includes generating a plurality of effectiveness metrics associated with a plurality of vehicle types.
| 7. The computer-implemented method of claim 1, wherein:
the plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and
the effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions.
| 8. A computer system for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the system comprising:
a test computing system including a processor and a memory storing executable instructions that, when executed by the processor, cause the test computing system to:
implement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment;
present virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
in response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and
based upon the test responses, determine an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies.
| 9. The computer system of claim 8, wherein:
the plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and
the executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 10. The computer system of claim 9, wherein the executable instructions that cause the test computing system to determine the update to the effectiveness metric further cause the test computing system to determine a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 11. The computer system of claim 8, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 12. The computer system of claim 8, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.
| 13. The computer system of claim 8, wherein:
the executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types.
| 14. The computer system of claim 8, wherein:
the plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and
the effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions.
| 15. A non-transitory computer-readable medium storing instructions for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies that, when executed by at least one processor of a computer system, cause the computer system to:
implement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment;
present virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
in response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and
based upon the test responses, determine an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies.
| 16. The computer-readable medium of claim 15, wherein:
the plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and
the executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 17. The computer-readable medium of claim 15, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 18. The computer-readable medium of claim 15, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.
| 19. The computer-readable medium of claim 15, wherein:
the executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types.
| 20. The computer-readable medium of claim 15, wherein:
the plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and
the effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions. | The method involves implementing the multiple of autonomous or semi-autonomous vehicle technologies within a virtual test environment by a test computing system. The virtual test sensor data is presented to the virtual test environment, such that the virtual test sensor data simulates sensor data for operating the conditions associated with a multiple of test scenarios within the virtual test environment. The test responses of the multiple of autonomous or semi-autonomous vehicle technologies are generated in response to the virtual test sensor data. An effectiveness metric for the multiple of autonomous or semi-autonomous vehicle technologies are determined based upon the test responses. The effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the multiple of autonomous or semi-autonomous vehicle technologies. INDEPENDENT CLAIMS are included for the following :a computer system for evaluating a vehicle having a multiple of autonomous or semi-autonomous vehicle technologies anda computer-readable medium for storing instructions for evaluating a vehicle having a multiple of autonomous or semi-autonomous vehicle technologies. Method for evaluating a vehicle having a multiple of autonomous or semi-autonomous vehicle technologies. The information to the vehicle operator may improve the effective use of the autonomous operation features and reduce the risks associated with vehicle operation. The drawing shows a block diagram of an exemplary computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation. 100Autonomous Vehicle Insurance System102Front End Components104Back-end Components110Mobile Device112Links |
Please summarize the input | Vehicular traffic alerts for avoidance of abnormal traffic conditionsMethods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle. An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle.What is claimed is:
| 1. A computer-implemented method of generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile, the method comprising:
detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle;
generating, via the one or more processors associated with the first vehicle, an electronic message regarding the abnormal traffic condition;
transmitting, via a vehicle-mounted transceiver associated with the first vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition;
receiving, via the one or more processors associated with the first vehicle, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and
transmitting, via the one or more processors associated with the first vehicle, the telematics data to a remote server, wherein the remote server updates a vehicle-usage profile associated with the nearby vehicle.
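Purely as an illustration of claims 1 and 10, the snippet below builds a JSON alert payload carrying the condition type and GPS location, and shows a receiver-side check that ignores alerts beyond a threshold distance. The field names, message format, and 5 km default are assumptions, not part of any V2V standard.

```python
import json
import math
import time
from dataclasses import dataclass, asdict

@dataclass
class TrafficAlert:
    """Illustrative V2V alert payload; field names are assumptions."""
    condition: str        # e.g. "erratic_vehicle", "slowed_traffic", "bad_weather"
    latitude: float
    longitude: float
    timestamp: float

def build_alert(condition: str, lat: float, lon: float) -> str:
    """Serialize an alert for broadcast over the vehicle-mounted transceiver."""
    return json.dumps(asdict(TrafficAlert(condition, lat, lon, time.time())))

def should_act_on_alert(raw: str, own_lat: float, own_lon: float,
                        max_km: float = 5.0) -> bool:
    """Receiver-side filter: ignore alerts farther away than max_km (cf. claim 10)."""
    alert = json.loads(raw)
    # Equirectangular approximation is adequate at these distances.
    dx = math.radians(alert["longitude"] - own_lon) * math.cos(math.radians(own_lat))
    dy = math.radians(alert["latitude"] - own_lat)
    return 6371.0 * math.hypot(dx, dy) <= max_km

message = build_alert("slowed_traffic", 41.881, -87.623)
print(should_act_on_alert(message, 41.900, -87.650))   # about 3 km away -> True
```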
| 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.
| 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 4. The computer-implemented method of claim 1, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 5. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.
| 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the first vehicle.
| 9. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component:
analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and
performs an action based on the type of anomalous condition in order to modify the anomalous condition.
| 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and wherein the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.
| 11. A computer system configured to generate a vehicle-to-vehicle traffic alert and update a vehicle-usage profile, the computer system comprising one or more processors, the one or more processors configured to:
detect that an abnormal traffic condition exists in an operating environment of a first vehicle;
generate an electronic message regarding the abnormal traffic condition;
transmit, via a vehicle-mounted transceiver associated with the first vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition;
receive telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and
transmit the telematics data to a remote server, wherein the remote server updates a vehicle-usage profile associated with the nearby vehicle.
| 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.
| 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 14. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.
| 15. The computer system of claim 11, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 16. The computer system of claim 11, wherein the one or more processors is one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.
| 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.
| 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 20. The computer system of claim 11, wherein the nearby vehicle travels to the operating environment of the first vehicle. | The method involves detecting that an abnormal traffic condition exists in an operating environment of the first vehicle, generating an electronic message regarding the abnormal traffic condition, transmitting the electronic message to a nearby vehicle, in which the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition, receiving telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmitting the telematics data to a remote server via the processors associated with the first vehicle, in which the remote server updates a vehicle-usage profile associated with the nearby vehicle. An INDEPENDENT CLAIM is also included for a computer system. Computer-implemented method of generating vehicle-to-vehicle traffic alert and updating vehicle-usage profile. Helps improve driving behavior by providing for feedback to the evaluated driver. Saves processing power and battery life since the second computing device ignores the telematics data. The drawing shows the block diagram of the telematics collection system. 100Telematics collection system106External computing device108Vehicle110Computing device114On-board computer |
Please summarize the input | Vehicular traffic alerts for avoidance of abnormal traffic conditionsMethods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle. An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle.What is claimed is:
| 1. A computer-implemented method of generating a vehicle traffic alert, the method comprising:
detecting, via one or more processors, that an abnormal traffic condition exists in a vehicle operating environment;
generating, via the one or more processors, an electronic message regarding the abnormal traffic condition;
transmitting the electronic message to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition, the abnormal traffic condition having already occurred in the vehicle operating environment, wherein the type of anomalous condition is selected from at least one of a set of transient conditions or non-transient conditions, and wherein determining the type of anomalous condition comprises comparing sensor data with previously recorded data for the operating environment, and wherein the smart infrastructure component performs an action based upon the type of anomalous condition in order to modify the anomalous condition into an altered roadway condition; and
transmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the altered roadway condition, to allow the nearby vehicle to avoid or approach the altered roadway condition.
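A hypothetical sketch of the infrastructure-side classification in claim 1: the reported reading is compared with previously recorded data for the same location, and the anomaly is labelled transient or non-transient depending on whether the deviation has persisted. The 20% deviation threshold and the three-report window are illustrative assumptions.

```python
def classify_anomaly(current, baseline, history, window=3, rel_threshold=0.2):
    """Compare a reported reading (e.g. average segment speed) against the
    long-term baseline and the last few reports for the same location. A
    deviation that persists across the whole window is treated as non-transient
    (e.g. construction); a fresh deviation as transient (e.g. a stalled vehicle)."""
    limit = rel_threshold * abs(baseline)
    if abs(current - baseline) < limit:
        return "normal"
    recent = history[-window:]
    persistent = len(recent) == window and all(abs(h - baseline) >= limit for h in recent)
    return "non_transient" if persistent else "transient"

# Example: baseline speed of 60 km/h on a monitored road segment.
print(classify_anomaly(20, 60, history=[58, 61, 22]))   # -> transient
print(classify_anomaly(20, 60, history=[25, 22, 21]))   # -> non_transient
```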
| 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the nearby vehicle.
| 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 4. The computer-implemented method of claim 1 further comprising updating, via the one or more processors, a vehicle-usage profile associated with the nearby vehicle based upon received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 5. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.
| 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the vehicle operating environment.
| 9. The computer-implemented method of claim 1, wherein the smart infrastructure component comprises a smart traffic light.
| 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.
| 11. A computer system configured to generate a vehicle traffic alert, the computer system comprising one or more processors, the one or more processors configured to:
detect that an abnormal traffic condition exists in a vehicle operating environment;
generate an electronic message regarding the abnormal traffic condition;
transmit the electronic message to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition, the abnormal traffic condition having already occurred in the vehicle operating environment, wherein the type of anomalous condition is selected from at least one of a set of transient conditions or non-transient conditions, and wherein determining the type of anomalous condition comprises comparing sensor data with previously recorded data for the operating environment, and wherein the smart infrastructure component performs an action based upon the type of anomalous condition in order to modify the anomalous condition into an altered roadway condition; and
transmit the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the altered roadway condition, to allow the nearby vehicle to avoid or approach the altered roadway condition.
| 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the nearby vehicle.
| 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.
| 14. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.
| 15. The computer system of claim 11, the system further configured to update a vehicle-usage profile associated with the nearby vehicle based upon received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 16. The computer system of claim 11, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.
| 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.
| 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 20. The computer system of claim 11, wherein the nearby vehicle travels to the vehicle operating environment. | The method involves detecting that an abnormal traffic condition exists in a vehicle operating environment, and generating an electronic message regarding the condition. The message is transmitted to a smart infrastructure component within a proximity of the environment. The component analyzes the message to determine a type of anomalous condition for the condition, where the component performs an action based upon the type of condition to modify the condition into an altered roadway condition, and transmits the message via wireless communication to alert a nearby vehicle of the altered condition to allow the vehicle to avoid or approach the roadway condition. An INDEPENDENT CLAIM is included for a computer system configured to generate a vehicle traffic alert. Computer-implemented method for generating a vehicle traffic alert. The data collected may be used to generate vehicle-usage profiles that more accurately reflect vehicle risk, or lack thereof, and facilitate more appropriate auto insurance pricing. The electronic message may then be transmitted through the vehicle's transceiver using a wireless communication to the nearby vehicle to alert the nearby vehicles of the abnormal traffic condition and to allow the neighboring vehicles to avoid the abnormally occurring traffic condition. The drawing shows a schematic diagram of a telematics collection system. |
Please summarize the input | Accident risk model determination using autonomous vehicle operating dataMethods and systems for evaluating the effectiveness of autonomous operation features of autonomous vehicles using an accident risk model are provided. According to certain aspects, an accident risk model may be determined using effectiveness information regarding autonomous operation features associated with a vehicle. The effectiveness information may indicate a likelihood of an accident for the vehicle and may include test data or actual loss data. Determining the likelihood of an accident may include determining risk factors for the features related to the ability of the features to make control decisions that successfully avoid accidents. The accident risk model may further include information regarding effectiveness of the features relative to location or operating conditions, as well as types and severity of accidents. The accident risk model may further be used to determine or adjust aspects of an insurance policy associated with an autonomous vehicle.What is claimed is:
| 1. A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising:
generating, by one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies, test data regarding results of virtual testing within the virtual test environment in which responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data are simulated;
receiving, by one or more processors, effectiveness information regarding the autonomous or semi-autonomous vehicle technology, the effectiveness information including both (i) actual accident data associated with vehicles having the autonomous or semi-autonomous vehicle technology and (ii) the test data associated with the autonomous or semi-autonomous vehicle technology;
determining, by one or more processors, an indication of reliability of the autonomous or semi-autonomous vehicle technology based at least in part upon compatibility of a version of or an update to computer-readable instructions involved in implementation of part or all of the autonomous or semi-autonomous vehicle technology with one or more versions of the at least one additional autonomous or semi-autonomous vehicle technologies tested;
determining, by one or more processors, an accident risk model based upon the received effectiveness information and the determined indication of reliability;
determining, by one or more processors, an insurance policy for a vehicle equipped with the autonomous or semi-autonomous vehicle technology based at least in part upon the accident risk model; and
causing, by one or more processors, information regarding all or a portion of the determined insurance policy for the vehicle to be presented to a customer for review by the customer via a display of a computing device associated with the customer.
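The sketch below is a toy stand-in for the accident risk model of claim 1, not the actual actuarial model: it blends the observed accident rate with the virtual-test failure rate, inflates the result when the tested software version is incompatible with the other on-board features, and scales a base premium accordingly. All weights and reference values are assumptions.

```python
def accident_risk_score(actual_accident_rate, virtual_pass_rate, compatibility_ok,
                        w_actual=0.6, w_test=0.4, incompatibility_penalty=1.25):
    """Blend the observed accident rate with the virtual-test failure rate, then
    inflate the score if the tested software version is incompatible with the
    vehicle's other autonomous operation features. Weights are assumptions."""
    blended = w_actual * actual_accident_rate + w_test * (1.0 - virtual_pass_rate)
    return blended * (1.0 if compatibility_ok else incompatibility_penalty)

def adjusted_premium(base_premium, risk_score, reference_risk=0.05):
    """Scale a base premium by the ratio of modelled risk to a reference risk."""
    return base_premium * (risk_score / reference_risk)

score = accident_risk_score(actual_accident_rate=0.03, virtual_pass_rate=0.95,
                            compatibility_ok=True)
print(round(score, 3), round(adjusted_premium(1000.0, score), 2))   # 0.038 760.0
```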
| 2. The computer-implemented method of claim 1, wherein the accident risk model is associated with a likelihood that vehicles having the autonomous or semi-autonomous vehicle technology will be involved in vehicle accidents.
| 3. The computer-implemented method of claim 1, wherein the accident risk model comprises a data structure containing entries associated with at least one of (1) the autonomous or semi-autonomous vehicle technology or (2) a likelihood of a vehicle accident.
| 4. The computer-implemented method of claim 1, further comprising:
storing, by a non-transient computer-readable medium, the accident risk model;
receiving, by one or more processors, a request to determine the insurance policy for the vehicle; and
accessing, by one or more processors, the accident risk model based upon the received request.
| 5. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology involves at least one of a vehicle self-braking functionality or a vehicle self-steering functionality.
| 6. The computer-implemented method of claim 1, wherein determining the insurance policy includes calculating at least one of the following based upon the autonomous or semi-autonomous vehicle technology and the accident risk model: an automobile insurance premium, a discount, or a reward.
| 7. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology is related to at least one of the following:
driver alertness monitoring;
driver responsiveness monitoring;
pedestrian detection;
artificial intelligence;
a back-up system;
a navigation system;
a positioning system;
a security system;
an anti-hacking measure;
a theft prevention system; or
remote vehicle location determination.
| 8. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology is related to at least one of the following:
a point of impact;
a type of road;
a time of day;
a weather condition;
a type of a trip;
a length of a trip;
a vehicle style;
a vehicle-to-vehicle communication; or
a vehicle-to-infrastructure communication.
| 9. The computer-implemented method of claim 1, wherein causing information regarding all or a portion of the determined insurance policy for the vehicle to be presented to the customer includes causing to be presented on the display a cost of automobile insurance coverage.
| 10. The computer-implemented method of claim 1, wherein determining the accident risk model includes determining at least one risk level associated with the autonomous or semi-autonomous vehicle technology based upon observed responses of the autonomous or semi-autonomous vehicle technology in other vehicles.
| 11. The computer-implemented method of claim 1, wherein determining the insurance policy for the vehicle includes at least one of generating a new insurance policy associated with the vehicle or updating an existing insurance policy associated with the vehicle.
| 12. The computer-implemented method of claim 1, wherein the accident risk model further accounts for an effect of one or more of the following on the effectiveness information: weather, road type, or vehicle type.
| 13. A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising:
one or more processors;
one or more communication modules adapted to communicate data;
one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies to generate test data regarding results of virtual testing within the virtual test environment in which responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data are simulated, and wherein the test results are communicated to the one or more processors via the one or more communication modules; and
a program memory coupled to the one or more processors and storing executable instructions that when executed by the one or more processors cause the computer system to:
receive effectiveness information regarding the autonomous or semi-autonomous vehicle technology, the effectiveness information including both (i) actual accident data associated with vehicles having the autonomous or semi-autonomous vehicle technology and (ii) the test data associated with the autonomous or semi-autonomous vehicle technology;
determine an indication of reliability of the autonomous or semi-autonomous vehicle technology based at least in part upon compatibility of a version of or an update to computer-readable instructions involved in implementation of part or all of the autonomous or semi-autonomous vehicle technology with one or more versions of the at least one additional autonomous or semi-autonomous vehicle technologies tested;
determine an accident risk model based upon the received effectiveness information and the determined indication of reliability;
determine an insurance policy for a vehicle equipped with the autonomous or semi-autonomous vehicle technology based at least in part upon the accident risk model; and
cause, via the one or more communication modules, information regarding all or a portion of the determined insurance policy for the vehicle to be presented to a customer for review by the customer via a display of a computing device associated with the customer.
| 14. The computer system of claim 13, wherein the accident risk model is associated with a likelihood that vehicles having the autonomous or semi-autonomous vehicle technology will be involved in vehicle accidents.
| 15. The computer system of claim 13, wherein the accident risk model comprises a data structure containing entries associated with at least one of (1) the autonomous or semi-autonomous vehicle technology or (2) a likelihood of a vehicle accident.
| 16. The computer system of claim 13, wherein the executable instructions further cause the computer system to:
store the accident risk model;
receive, via the one or more communication modules, a request to determine the insurance policy for the vehicle; and
access the accident risk model based upon the received request.
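Claim 16 adds a store-and-retrieve flow around that model: the system persists the accident risk model, receives a request to determine an insurance policy for a vehicle, and accesses the stored model in response. A minimal in-memory sketch of that flow is shown below; the function names, the dictionary-backed store, and the request fields are all assumptions, and a real system could use any storage technology.

```python
from typing import Any, Dict

# A hypothetical in-memory store; a production system could persist to any database.
_risk_model_store: Dict[str, Dict[str, Any]] = {}


def store_accident_risk_model(model_id: str, model: Dict[str, Any]) -> None:
    """Persist the accident risk model so later policy requests can access it."""
    _risk_model_store[model_id] = model


def handle_policy_request(request: Dict[str, Any]) -> Dict[str, Any]:
    """Receive a request to determine an insurance policy and access the stored model."""
    model = _risk_model_store[request["model_id"]]
    technology = request["technology"]
    # Look up the risk figure the stored model associates with the requested technology.
    risk = model.get(technology, {"accident_likelihood": None})
    return {"vehicle_id": request["vehicle_id"], "technology": technology, "risk": risk}


# Illustrative request/response round trip with made-up values.
store_accident_risk_model("v1", {"automatic_braking": {"accident_likelihood": 0.009}})
print(handle_policy_request({"model_id": "v1",
                             "vehicle_id": "VIN123",
                             "technology": "automatic_braking"}))
```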
| 17. A tangible, non-transitory computer-readable medium storing instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology that, when executed by at least one processor of a computer system, cause the computer system to:
generate, using one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies, test data regarding results of virtual testing within the virtual test environment in which responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data are simulated;
receive effectiveness information regarding the autonomous or semi-autonomous vehicle technology, the effectiveness information including both (i) actual accident data associated with vehicles having the autonomous or semi-autonomous vehicle technology and (ii) the test data associated with the autonomous or semi-autonomous vehicle technology;
determine an indication of reliability of the autonomous or semi-autonomous vehicle technology based at least in part upon compatibility of a version of or an update to computer-readable instructions involved in implementation of part or all of the autonomous or semi-autonomous vehicle technology with one or more versions of the at least one additional autonomous or semi-autonomous vehicle technologies tested;
determine an accident risk model based upon the received effectiveness information and the determined indication of reliability;
determine an insurance policy for a vehicle equipped with the autonomous or semi-autonomous vehicle technology based at least in part upon the accident risk model; and
cause information regarding all or a portion of the determined insurance policy for the vehicle to be presented to a customer for review by the customer via a display of a computing device associated with the customer.
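Claim 17 strings the individual steps into one pipeline: virtual test data and actual accident data form the effectiveness information, a reliability indication is derived from version compatibility, an accident risk model follows, and a policy is produced for presentation. The claims leave every computation unspecified, so the sketch below merely wires hypothetical placeholder functions together in that order; none of the formulas, weights, or names come from the patent.

```python
from typing import Dict, List


def run_virtual_tests(technology: str, scenarios: List[str]) -> Dict[str, float]:
    """Stand-in for virtual testing: pretend every scenario yields a pass rate."""
    return {scenario: 0.95 for scenario in scenarios}  # made-up constant pass rate


def reliability_from_compatibility(version: str, other_versions: List[str]) -> float:
    """Toy reliability indication: fraction of co-tested versions assumed compatible."""
    compatible = [v for v in other_versions if v.split(".")[0] == version.split(".")[0]]
    return len(compatible) / len(other_versions) if other_versions else 1.0


def build_risk_model(test_data: Dict[str, float],
                     actual_accidents_per_mile: float,
                     reliability: float) -> float:
    """Collapse effectiveness information and reliability into a single risk level."""
    avg_pass = sum(test_data.values()) / len(test_data)
    return (1.0 - avg_pass) * 0.5 + actual_accidents_per_mile * 0.5 * (2.0 - reliability)


def determine_policy(risk_level: float, base_premium: float = 1000.0) -> Dict[str, float]:
    """Scale an assumed base premium by the modeled risk level (purely illustrative)."""
    return {"risk_level": risk_level, "premium": base_premium * (1.0 + risk_level)}


# Wire the steps together with invented inputs.
test_data = run_virtual_tests("lane_keeping_assist", ["rain", "night", "merge"])
reliability = reliability_from_compatibility("2.1", ["2.0", "2.3", "1.9"])
risk = build_risk_model(test_data, actual_accidents_per_mile=0.02, reliability=reliability)
print(determine_policy(risk))
```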
| 18. The tangible, non-transitory computer-readable medium of claim 17, wherein the accident risk model is associated with a likelihood that vehicles having the autonomous or semi-autonomous vehicle technology will be involved in vehicle accidents.
| 19. The tangible, non-transitory computer-readable medium of claim 17, wherein the accident risk model comprises a data structure containing entries associated with at least one of (1) the autonomous or semi-autonomous vehicle technology or (2) a likelihood of a vehicle accident.
| 20. The tangible, non-transitory computer-readable medium of claim 17, further comprising executable instructions that further cause the computer system to:
store the accident risk model;
receive, via one or more communication modules, a request to determine the insurance policy for the vehicle; and
access the accident risk model based upon the received request. | The computer-based method involves generating test data regarding results of virtual testing. The effectiveness information regarding the autonomous or semi-autonomous vehicle technology is received. An indication of reliability of the autonomous or semi-autonomous vehicle technology is determined. An insurance policy is determined for a vehicle (108) equipped with the autonomous or semi-autonomous vehicle technology. The information regarding all or a portion of the determined insurance policy for the vehicle is caused to be presented to a customer for review. INDEPENDENT CLAIMS are included for the following: a computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology; and a tangible, non-transitory computer-readable medium storing instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology. Computer-based method of evaluating effectiveness of autonomous or semi-autonomous vehicle technology. The autonomous vehicle operation features either assist the vehicle operator to more safely or efficiently operate a vehicle or take full control of vehicle operation under some or all circumstances. An automobile insurance premium is determined by evaluating how effectively the vehicle is able to avoid and mitigate crashes and the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a block diagram of a computer network, a computer server, a mobile device and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation and insurance processes. 100Autonomous vehicle insurance system102Front end component104Back end component108Vehicle110Mobile device
Please summarize the input | Fully autonomous vehicle insurance pricing. Methods and systems for determining risk associated with operation of fully autonomous vehicles are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including types and version of sensors, control systems, and software. This information may be used to determine a risk profile reflecting risk levels for a plurality of features, which may be based upon test data regarding the features or actual loss data. Expected use levels may further be determined and used with the risk profile to determine a total risk level associated with operation of the vehicle by the autonomous operation features. The expected use levels may indicate expected vehicle use, as well as traffic, weather, or other conditions in which the vehicle is likely to operate. The total risk level may be used to determine or adjust aspects of an insurance policy associated with the vehicle. What is claimed is:
| 1. A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the computer system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to:
determine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged;
receive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip;
receive sensor data associated with the vehicle;
determine a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data;
determine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions;
determine types of one or more sensors installed in the vehicle based upon the sensor data associated with the vehicle;
adjust the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and
cause one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.
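The claim above combines a risk profile (per weather and road condition, with the feature engaged versus disengaged) with expected use levels and then adjusts the result for the sensor types installed on the vehicle. One way to picture that combination is the expectation-weighted sum sketched below; the numbers, the weighting rule, and the sensor adjustment factors are all invented for illustration and are not mandated by the claim.

```python
# Risk profile: risk level per (condition, feature state); values are illustrative.
risk_profile = {
    ("rain", "engaged"): 0.8, ("rain", "disengaged"): 1.4,
    ("dry", "engaged"): 0.4,  ("dry", "disengaged"): 0.7,
}

# Expected use levels: probability the operator engages the feature in each condition,
# as inferred from the usage log (numbers invented for the example).
expected_engagement = {"rain": 0.3, "dry": 0.9}

# Share of driving expected in each condition.
condition_share = {"rain": 0.25, "dry": 0.75}

# Hypothetical adjustment factors for the sensor suite detected on the vehicle.
sensor_adjustment = {"lidar": 0.95, "camera_only": 1.05}


def total_risk_level(installed_sensors: str) -> float:
    """Expectation-weighted risk across conditions, scaled by a sensor-type factor."""
    risk = 0.0
    for condition, share in condition_share.items():
        p_engaged = expected_engagement[condition]
        risk += share * (p_engaged * risk_profile[(condition, "engaged")]
                         + (1.0 - p_engaged) * risk_profile[(condition, "disengaged")])
    return risk * sensor_adjustment[installed_sensors]


print(round(total_risk_level("lidar"), 3))
```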
| 2. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
estimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged based upon the log of usage data; and
adjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the estimated future usage or operation of the vehicle, either by time or mileage, the vehicle is predicted to be operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged.
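Claim 2 layers an estimate of future usage, by time or mileage per condition and feature state, on top of the risk profile and expected use levels. A tiny mileage-weighted variant of the adjustment is sketched below, again with invented figures and no claim to be the patented formula.

```python
# Estimated future miles in each (condition, feature state) bucket; numbers invented.
estimated_miles = {
    ("rain", "engaged"): 500, ("rain", "disengaged"): 1500,
    ("dry", "engaged"): 7000, ("dry", "disengaged"): 1000,
}

# Per-mile risk levels for the same buckets (illustrative risk profile values).
per_mile_risk = {
    ("rain", "engaged"): 3e-6, ("rain", "disengaged"): 6e-6,
    ("dry", "engaged"): 1e-6,  ("dry", "disengaged"): 2e-6,
}

# Adjusted total risk: mileage-weighted sum over every bucket.
adjusted_total_risk = sum(miles * per_mile_risk[bucket]
                          for bucket, miles in estimated_miles.items())
print(f"{adjusted_total_risk:.6f}")
```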
| 3. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
estimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and
adjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is estimated to be operated in each of the plurality of weather and road conditions indicated by the usage data received in the log of usage data.
| 4. The system of claim 1, wherein:
the risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features;
the test results include responses of the test units to test inputs corresponding to test scenarios, the test scenarios include vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and
the test results are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles.
| 5. The system of claim 1, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.
| 6. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
receive a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device;
determine a premium associated with the vehicle insurance policy based upon the total risk level; and
present an option to purchase the vehicle insurance policy to the customer associated with the vehicle.
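Claim 6 turns the total risk level into a quoted premium and an option to purchase. The mapping from risk to premium is not specified anywhere in the claims, so the function below is a deliberately simple linear placeholder around an assumed base rate; all names and amounts are hypothetical.

```python
def quote_premium(total_risk_level: float,
                  base_premium: float = 900.0,
                  risk_multiplier: float = 400.0) -> float:
    """Hypothetical linear pricing: base rate plus a surcharge that grows with risk."""
    return base_premium + risk_multiplier * total_risk_level


def present_purchase_option(vehicle_id: str, premium: float) -> str:
    """Stand-in for presenting the quote to the customer for purchase."""
    return f"Vehicle {vehicle_id}: annual premium ${premium:.2f} - reply YES to purchase"


# Illustrative quote for a made-up risk level.
print(present_purchase_option("VIN123", quote_premium(total_risk_level=0.6)))
```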
| 7. The system of claim 1, wherein the log of usage data regarding the one or more autonomous operation features includes a version of autonomous operation feature control software that is currently installed on the vehicle or in an autonomous operation feature system mounted on the vehicle.
| 8. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
receive information regarding a type and version of the one or more autonomous operation features; and
update the total risk level associated with autonomous operation of the vehicle based upon the type and version of the one or more autonomous operation features.
| 9. The system of claim 1, wherein the one or more autonomous operation features include a vehicle-to-vehicle (V2V) wireless communication capability, and wherein the executable instructions further cause the one or more processors to:
receive telematics data generated or broadcast from other vehicles; and
generate and display alternate routes based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions.
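Claim 9 describes consuming telematics broadcast from other vehicles and surfacing alternate routes that steer around bad weather, traffic, or road conditions. A minimal route-scoring sketch follows, assuming hypothetical hazard reports keyed by road segment; nothing here reflects an actual V2V message format, and the segment identifiers are invented.

```python
from typing import Dict, List

# Hypothetical V2V hazard reports: road segment -> hazard severity (0 = clear).
v2v_reports: Dict[str, float] = {"I-90_mile12": 0.9, "RT-20_mile3": 0.2}

# Candidate routes expressed as lists of road segments (invented identifiers).
candidate_routes: Dict[str, List[str]] = {
    "primary":   ["I-90_mile10", "I-90_mile12", "I-90_mile14"],
    "alternate": ["RT-20_mile1", "RT-20_mile3", "RT-20_mile5"],
}


def route_hazard_score(segments: List[str]) -> float:
    """Sum the reported hazard severity over a route's segments."""
    return sum(v2v_reports.get(segment, 0.0) for segment in segments)


def safer_routes() -> List[str]:
    """Order candidate routes from least to most reported hazard."""
    return sorted(candidate_routes,
                  key=lambda name: route_hazard_score(candidate_routes[name]))


# The alternate route is listed first because it avoids the reported hazard.
print(safer_routes())
```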
| 10. A computer-implemented method for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, comprising:
determining, by one or more processors, a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged;
receiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission transmitted from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip;
determining, by the one or more processors, a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data;
determining, by the one or more processors, a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions;
determining, via the one or more processors, types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle;
adjusting, via the one or more processors, the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and
causing, by the one or more processors, one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.
| 11. The method of claim 10, the method comprising:
estimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged; and
adjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged indicated by the usage data received in the log of usage data.
| 12. The method of claim 10, the method comprising:
estimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and
adjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the amount of time or miles that the vehicle is expected to be operated in the future in each of the plurality of weather and road conditions for a given time period.
| 13. The method of claim 10, wherein:
the risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features;
the test results include responses of the test units to test inputs corresponding to test scenarios, the test scenarios including vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and
the test results are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles.
| 14. The method of claim 10, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.
| 15. The method of claim 10, further comprising:
receiving, at the one or more processors or the associated transceiver, a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device;
determining, by one or more processors, a premium associated with the vehicle insurance policy based upon the total risk level; and
presenting, by one or more processors, an option to purchase the vehicle insurance policy to the customer associated with the vehicle.
| 16. The method of claim 10, the method further comprising:
receiving, via the one or more processors or the associated transceiver, information regarding a type and version of the one or more autonomous operation features; and
updating the total risk level associated with autonomous operation of the vehicle, via the one or more processors, based upon the type and version of the one or more autonomous operation features.
| 17. The method of claim 10, wherein the autonomous operation feature is a vehicle-to-vehicle (V2V) wireless communication capability, and the method comprises:
receiving, via one or more vehicle-mounted processors or associated transceiver, telematics data generated or broadcast from other vehicles; and
generating and displaying alternate routes, via the one or more vehicle-mounted processors, based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions.
| 18. A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to:
determine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged;
receive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip;
determine from analysis of the usage data received in the log of usage data a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions;
determine from analysis of the usage data received in the log of usage data, or from analysis of other vehicle or telematics data received from the vehicle or mobile device, an average amount of time or miles that the vehicle operator operates the vehicle during each of the plurality of weather and road operating conditions for a period of time;
determine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions, and (c) the average amount of time or miles that the vehicle operator operates the vehicle during each of the plurality of weather and road operating conditions for the period of time to facilitate more accurate risk assessment and auto insurance pricing;
determine types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle;
adjust the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and
cause one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. | The computer system comprises one or more processors (162), transceivers coupled to processors, and program memories (160) coupled to the processors and storing executable instructions that cause the one or more processors to determine a risk profile associated with operation of the vehicle that includes multiple risk levels associated with operation of the vehicle under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged. A log of usage data regarding previous use of the one or more autonomous operation features of the vehicle is received by a vehicle operator during multiple weather and road conditions, via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, where the mobile device (110) generates the log of usage data from the data received from the on-board computer during a vehicle trip. The log of usage data includes timestamp indicating a beginning of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with one or more autonomous operation features when engaged, and where the log of usage data further includes current weather and road conditions during the vehicle trip, receive sensor data associated with the vehicle, determine multiple expected use levels of the vehicle during multiple weather and road operating conditions, where the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data. The total risk level associated with overall operation of the vehicle based a portion upon the determined risk profile, and expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions are determine. The determine types of one or more sensors installed in the vehicle based upon the sensor data associated with the vehicle, adjusting the total risk level associated with autonomous operation of the vehicle based in portion upon the types of sensors installed in the vehicle. 
An INDEPENDENT CLAIM is included for a computer-implemented method for monitoring usage of a vehicle having autonomous operation features for controlling the vehicle, which involves: determining, by one or more processors, a risk profile associated with operation of the vehicle that includes multiple risk levels associated with operation of the vehicle under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; receiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during multiple weather and road conditions, via wireless communication or data transmission transmitted from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, where the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and where the log of usage data further includes current weather and road conditions during the vehicle trip; determining, by the one or more processors, multiple expected use levels of the vehicle during multiple weather and road operating conditions, where the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data; determining, by the one or more processors, a total risk level associated with overall operation of the vehicle based at least in part upon the determined risk profile and the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions; determining, via the one or more processors, types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; adjusting, via the one or more processors, the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and causing, by the one or more processors, one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. Computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle. 
The computer system makes it possible to monitor the driving experience and/or usage of the autonomous or semi-autonomous vehicle technology, over small time frames and/or periodically, to provide feedback to the driver or insurance provider and/or adjust insurance policies or premiums, and to determine the automobile insurance premium by evaluating how effectively the vehicle is able to avoid and/or mitigate crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a block diagram of a computer system. 102Front End Components110Mobile device160Program memories162Processors165Address/data Bus
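The usage log recited throughout these claims is concrete enough to sketch: trip start and end timestamps, engagement and disengagement timestamps, feature configuration, and the weather and road conditions during the trip. The snippet below shows one hypothetical record layout and how expected use levels per condition might be estimated from a batch of such logs; the field names and the simple ratio-based estimate are assumptions, not the patent's method.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class TripLog:
    """One trip's usage log as described in the claims (field names are illustrative)."""
    trip_start: str                        # timestamp for the beginning of the trip
    trip_end: str                          # timestamp for the end of the trip
    feature_events: List[Tuple[str, str]]  # (timestamp, "engaged" | "disengaged")
    configuration: Dict[str, str]          # feature configuration while engaged
    weather: str                           # e.g. "rain"
    road: str                              # e.g. "highway"


def expected_use_levels(logs: List[TripLog]) -> Dict[Tuple[str, str], float]:
    """Fraction of trips per (weather, road) condition in which the feature was engaged."""
    engaged_counts: Dict[Tuple[str, str], int] = {}
    total_counts: Dict[Tuple[str, str], int] = {}
    for log in logs:
        key = (log.weather, log.road)
        total_counts[key] = total_counts.get(key, 0) + 1
        if any(state == "engaged" for _, state in log.feature_events):
            engaged_counts[key] = engaged_counts.get(key, 0) + 1
    return {key: engaged_counts.get(key, 0) / total for key, total in total_counts.items()}


# Two made-up trips: the feature was used on the dry highway trip but not in the rain.
logs = [
    TripLog("2024-05-01T08:00", "2024-05-01T08:40",
            [("2024-05-01T08:05", "engaged")], {"following_gap": "medium"}, "dry", "highway"),
    TripLog("2024-05-02T08:00", "2024-05-02T08:50", [], {}, "rain", "highway"),
]
print(expected_use_levels(logs))
```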
Please summarize the input | Autonomous vehicle technology effectiveness determination for insurance pricing. Methods and systems for determining the effectiveness of one or more autonomous (and/or semi-autonomous) operation features of a vehicle are provided. According to certain aspects, information regarding autonomous operation features of the vehicle may be used to determine an effectiveness metric indicative of the ability of each autonomous operation feature to avoid or mitigate accidents or other losses. The information may include operating data from the vehicle or other vehicles having similar autonomous operation features, test data, or loss data from other vehicles. The determined effectiveness metric may then be used to determine part or all of an insurance policy, which may be reviewed by an insured and updated based upon the effectiveness metric. What is claimed is:
| 1. A computer-implemented method for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the method comprising:
implementing, by a test computing system, the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment;
presenting, by the test computing system, virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
in response to the virtual test sensor data generating, by the test computing system, test responses of the plurality of autonomous or semi-autonomous vehicle technologies;
based upon the test responses, determining, by the test computing system, an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies;
receiving, by a computing system, an indication of the vehicle having the plurality of autonomous or semi-autonomous vehicle technologies; and
updating, by the computing system, an insurance policy associated with the vehicle based upon the determined effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies.
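Claim 1 of this set evaluates several technologies together in a virtual test environment and derives an effectiveness metric indicating the combined reliability of different combinations of them. As a toy illustration only, the sketch below averages simulated pass/fail responses for each combination of enabled technologies; the scenario names, response values, and averaging rule are all invented rather than taken from the patent.

```python
from itertools import combinations
from typing import Dict, FrozenSet, List

technologies = ["auto_braking", "lane_keeping", "adaptive_cruise"]
scenarios = ["stopped_traffic", "lane_drift", "merging_vehicle"]


def simulate_response(enabled: FrozenSet[str], scenario: str) -> float:
    """Pretend virtual test harness: 1.0 if an enabled technology handles the scenario."""
    handles = {"stopped_traffic": "auto_braking",
               "lane_drift": "lane_keeping",
               "merging_vehicle": "adaptive_cruise"}
    return 1.0 if handles[scenario] in enabled else 0.0


def effectiveness_metrics() -> Dict[FrozenSet[str], float]:
    """Average simulated response over all scenarios for every non-empty combination."""
    metrics: Dict[FrozenSet[str], float] = {}
    for size in range(1, len(technologies) + 1):
        for combo in combinations(technologies, size):
            enabled = frozenset(combo)
            scores: List[float] = [simulate_response(enabled, s) for s in scenarios]
            metrics[enabled] = sum(scores) / len(scores)
    return metrics


for combo, metric in sorted(effectiveness_metrics().items(), key=lambda kv: -kv[1]):
    print(sorted(combo), round(metric, 2))
```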
| 2. The computer-implemented method of claim 1, wherein:
the plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and
determining the effectiveness metric includes determining an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 3. The computer-implemented method of claim 2, wherein determining the update to the effectiveness metric includes determining a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 4. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 5. The computer-implemented method of claim 1, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.
| 6. The computer-implemented method of claim 1, wherein:
determining the effectiveness metric includes generating a plurality of effectiveness metrics associated with a plurality of vehicle types;
the indication of the vehicle includes an indication of a vehicle type of the vehicle; and
updating the insurance policy associated with the vehicle is further based upon a corresponding effectiveness metric of the plurality of effectiveness metrics that is associated with the vehicle type of the vehicle.
| 7. The computer-implemented method of claim 1, wherein:
the plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and
the effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions.
| 8. A computer system for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the system comprising:
a test computing system including a processor and a memory storing executable instructions that, when executed by the processor, cause the test computing system to:
implement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment;
present virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
in response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and
based upon the test responses, determining an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies; and
a computing system including a processor and a memory storing executable instructions that, when executed by the processor, cause the computing system to:
receive an indication of the vehicle having the plurality of autonomous or semi-autonomous vehicle technologies; and
update an insurance policy associated with the vehicle based upon the determined effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies.
| 9. The computer system of claim 8, wherein:
the plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and
the executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 10. The computer system of claim 9, wherein the executable instructions that cause the test computing system to determine the update to the effectiveness metric further cause the test computing system to determine a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 11. The computer system of claim 8, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 12. The computer system of claim 8, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.
| 13. The computer system of claim 8, wherein:
the executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types;
the indication of the vehicle includes an indication of a vehicle type of the vehicle; and
the executable instructions that cause the computing system to update the insurance policy associated with the vehicle further cause the computing system to update the insurance policy based upon a corresponding effectiveness metric of the plurality of effectiveness metrics that is associated with the vehicle type of the vehicle.
| 14. The computer system of claim 8, wherein:
the plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and
the effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions.
| 15. A non-transitory computer-readable medium storing instructions for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies that, when executed by at least one processor of a computer system, cause the computer system to:
implement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment;
present virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
in response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies;
based upon the test responses, determine an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies;
receive an indication of the vehicle having the plurality of autonomous or semi-autonomous vehicle technologies; and
update an insurance policy associated with the vehicle based upon the determined effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies.
| 16. The computer-readable medium of claim 15, wherein:
the plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and
the executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies.
| 17. The computer-readable medium of claim 15, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 18. The computer-readable medium of claim 15, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.
| 19. The computer-readable medium of claim 15, wherein:
the executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types;
the indication of the vehicle includes an indication of a vehicle type of the vehicle; and
the executable instructions that cause the computer system to update the insurance policy associated with the vehicle further cause the computer system to update the insurance policy based upon a corresponding effectiveness metric of the plurality of effectiveness metrics that is associated with the vehicle type of the vehicle.
| 20. The computer-readable medium of claim 15, wherein:
the plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and
the effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions. | The method involves generating an effectiveness metric associated with autonomous or semi-autonomous vehicle technologies based upon test responses by processors of a test computing system. An indication of a vehicle (108) including the autonomous or semi-autonomous vehicle technologies is received by the processors of a computing system. An insurance policy associated with the vehicle is updated based upon the effectiveness metric associated with the autonomous or semi-autonomous vehicle technologies by the processors of the computing system. INDEPENDENT CLAIMS are also included for the following: a computer system for evaluating effectiveness of autonomous or semi-autonomous vehicle technologies for controlling a vehicle to avoid or mitigate vehicle accidents; and a tangible, non-transitory computer-readable medium comprising a set of instructions for evaluating effectiveness of autonomous or semi-autonomous vehicle technologies for controlling a vehicle to avoid or mitigate vehicle accidents. Method for evaluating effectiveness of autonomous or semi-autonomous vehicle technologies for controlling a vehicle, i.e. an automobile, to avoid or mitigate vehicle accidents. The method allows an insurance provider to adjust or update insurance policies, premiums, rates, discounts, and/or other insurance-related items based upon a smart equipment warning functionality that can alert drivers of vehicle equipment or vehicle safety equipment that may need to be replaced or repaired, thus reducing collision risk. The method allows a vehicle operator to maximize effectiveness of an autonomous operation feature, maximize vehicle insurance coverage, and/or minimize vehicle insurance expense. The drawing shows a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 100Autonomous vehicle insurance system102Front-end components108Vehicle110Mobile device120Sensors130Network135Link155Controller164RAM
Please summarize the input | Accident risk model determination using autonomous vehicle operating data. Methods and systems for evaluating the effectiveness of autonomous operation features of autonomous vehicles using an accident risk model are provided. According to certain aspects, an accident risk model may be determined using effectiveness information regarding autonomous operation features associated with a vehicle. The effectiveness information may indicate a likelihood of an accident for the vehicle and may include test data or actual loss data. Determining the likelihood of an accident may include determining risk factors for the features related to the ability of the features to make control decisions that successfully avoid accidents. The accident risk model may further include information regarding effectiveness of the features relative to location or operating conditions, as well as types and severity of accidents. The accident risk model may further be used to determine or adjust aspects of an insurance policy associated with an autonomous vehicle. The invention claimed is:
| 1. A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising:
presenting, by the one or more processors, virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within a virtual test environment;
generating, by the one or more processors, test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data;
generating, by the one or more processors, an accident risk model indicating one or more risk levels for vehicles associated with the autonomous or semi-autonomous vehicle technology based upon the test responses;
receiving, at the one or more processors, actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment, the actual accident data comprising data collected by a vehicle sensor; and
adjusting, by the one or more processors, the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels.
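Claim 1 here first builds the risk model from virtual test responses and then adjusts its risk levels once actual accident data from a non-test environment arrives. A common way to picture such an adjustment is a credibility-style blend of the test-based level with the observed accident rate; the weighting scheme below is an assumption made for illustration, not language from the patent.

```python
def adjusted_risk_level(test_based_level: float,
                        observed_accidents: int,
                        observed_miles: float,
                        credibility_miles: float = 1_000_000.0) -> float:
    """Blend the test-derived risk level with the observed accident rate per mile.

    The weight given to real-world data grows with the miles observed, capped at 1.
    """
    observed_rate = observed_accidents / observed_miles if observed_miles else 0.0
    weight = min(observed_miles / credibility_miles, 1.0)
    return (1.0 - weight) * test_based_level + weight * observed_rate


# Illustrative adjustment with made-up exposure figures.
print(adjusted_risk_level(test_based_level=2e-6,
                          observed_accidents=3,
                          observed_miles=400_000.0))
```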
| 2. The computer-implemented method of claim 1, the method further comprising:
identifying, by the one or more processors, a customer vehicle having the autonomous or semi-autonomous vehicle control technology; and
generating or updating, by the one or more processors, an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model.
| 3. The computer-implemented method of claim 2, further comprising:
causing, by the one or more processors, information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle via a display of a customer computing device for review.
| 4. The computer-implemented method of claim 1, wherein:
generating the test responses includes generating test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology.
| 5. The computer-implemented method of claim 4, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.
| 6. The computer-implemented method of claim 1, wherein generating the accident risk model includes determining the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.
| 7. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 8. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology involves at least one of a vehicle self-braking functionality or a vehicle self-steering functionality.
| 9. A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising:
one or more processors;
one or more program memories coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the computer system to:
present virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment;
generate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data;
generate an accident risk model indicating one or more risk levels for vehicles associated with the autonomous or semi-autonomous vehicle technology based upon the test responses;
receive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment, the actual accident data comprising data collected by a vehicle sensor; and
adjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model.
| 10. The computer system of claim 9, wherein the executable instructions further cause the computer system to:
identify a customer vehicle having the autonomous or semi-autonomous vehicle control technology; and
generate or update an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model.
| 11. The computer system of claim 9, wherein:
the executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology.
| 12. The computer system of claim 11, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.
| 13. The computer system of claim 9, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.
| 14. The computer system of claim 9, wherein the executable instructions further cause the computer system to:
communicate to a customer computing device, via a communication network, information regarding all or a portion of an insurance policy to be presented to a customer associated with the customer vehicle for review via a display of the customer computing device.
| 15. The computer system of claim 9, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 16. A tangible, non-transitory computer-readable medium storing executable instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology that, when executed by at least one processor of a computer system, cause the computer system to:
present virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment;
generate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data;
generate an accident risk model indicating one or more risk levels for vehicles associated with the autonomous or semi-autonomous vehicle technology based upon the test responses;
receive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment, the actual accident data comprising data collected by a vehicle sensor; and
adjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model.
| 17. The tangible, non-transitory computer-readable medium of claim 16, wherein:
the executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology.
| 18. The tangible, non-transitory computer-readable medium of claim 17, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.
| 19. The tangible, non-transitory computer-readable medium of claim 16, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.
| 20. The tangible, non-transitory computer-readable medium of claim 16, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data. | The method involves generating test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data by the one or more processors (162). An accident risk model indicating one or more risk levels for vehicles (108) associated with the autonomous or semi-autonomous vehicle technology is generated based upon the test responses by the one or more processors. The actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment is received at the one or more processors, the actual accident data comprising data collected by a vehicle sensor. The accident risk model is adjusted based upon the actual accident data by adjusting at least one of the one or more risk levels by the one or more processors. INDEPENDENT CLAIMS are included for: a computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology; and a tangible, non-transitory computer-readable medium storing executable instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology. Computer-implemented method for evaluating effectiveness of autonomous or semi-autonomous vehicle technology such as driverless operation, accident avoidance or collision warning systems. The autonomous vehicle operation features either assist the vehicle operator to more safely or efficiently operate a vehicle or may take full control of vehicle operation under some or all circumstances. The risk assessment and premium determination for vehicle insurance policies covering vehicles with autonomous operation features are facilitated. The insurance premium for automobile insurance coverage or another cost associated with the insurance policy is presented through a display screen to a customer for review, acceptance, and/or approval. The drawing shows a block diagram of the computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 108Vehicle 110Mobile device 114Client device 130Network 162Processor
Please summarize the input | Fully autonomous vehicle insurance pricing. Methods and systems for determining risk associated with operation of fully autonomous vehicles are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including types and version of sensors, control systems, and software. This information may be used to determine a risk profile reflecting risk levels for a plurality of features, which may be based upon test data regarding the features or actual loss data. Expected use levels may further be determined and used with the risk profile to determine a total risk level associated with operation of the vehicle by the autonomous operation features. The expected use levels may indicate expected vehicle use, as well as traffic, weather, or other conditions in which the vehicle is likely to operate. The total risk level may be used to determine or adjust aspects of an insurance policy associated with the vehicle. What is claimed is:
| 1. A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the computer system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to:
determine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged;
receive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip;
receive sensor data associated with the vehicle;
determine a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data;
determine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions;
determine types of one or more sensors installed in the vehicle based upon the sensor data associated with the vehicle; and
adjust the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors in the vehicle.
| 2. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
estimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged based upon the log of usage data; and
adjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the estimated future usage or operation of the vehicle, either by time or mileage, as the vehicle is predicted to be operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged.
| 3. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
estimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and
adjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is estimated to be operated in each of the plurality of weather and road conditions, as indicated by the usage data received in the log of usage data.
| 4. The system of claim 1, wherein:
the risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features;
the test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios include vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and
the test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles.
| 5. The system of claim 1, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.
| 6. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
cause one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, or present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level.
| 7. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
receive a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device;
determine a premium associated with the vehicle insurance policy based upon the total risk level; and
present an option to purchase the vehicle insurance policy to the customer associated with the vehicle.
| 8. The system of claim 1, wherein the log of usage data regarding the one or more autonomous operation features includes a version of autonomous operation feature control software that is currently installed on the vehicle or in the autonomous operation feature system mounted on the vehicle.
| 9. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
receive information regarding a type and version of the one or more autonomous operation features; and
update the total risk level associated with autonomous operation of the vehicle based upon the type and version of the one or more autonomous operation features.
| 10. The system of claim 1, wherein the one or more autonomous operation features include a vehicle-to-vehicle (V2V) wireless communication capability, and wherein the executable instructions further cause the one or more processors to:
receive telematics data generated or broadcast from other vehicles; and
generate and display alternate routes based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions.
| 11. A computer-implemented method for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, comprising:
determining, by one or more processors, a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged;
receiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission transmitted from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip;
determining, by the one or more processors, a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data;
determining, by the one or more processors, a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions;
determining, via the one or more processors, types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; and
adjusting, via the one or more processors, the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle.
| 12. The computer-implemented method of claim 11, the method comprising:
estimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged; and
adjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged, as indicated by the usage data received in the log of usage data.
| 13. The computer-implemented method of claim 11, the method comprising:
estimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and
adjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the amount of time or miles that the vehicle is expected to be operated in the future in each of the plurality of weather and road conditions for a given time period.
| 14. The computer-implemented method of claim 11, wherein:
the risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features;
the test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios including vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and
the test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles.
| 15. The computer-implemented method of claim 11, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.
| 16. The computer-implemented method of claim 11, the method further comprising:
adjusting, via the one or more processors, an insurance policy associated with the vehicle.
| 17. The method of claim 11, further comprising:
receiving, at the one or more processors or the associated transceiver, a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device;
determining, by one or more processors, a premium associated with the vehicle insurance policy based upon the total risk level; and
presenting, by one or more processors, an option to purchase the vehicle insurance policy to the customer associated with the vehicle.
| 18. The method of claim 11, the method further comprising:
receiving, via the one or more processors or the associated transceiver, information regarding a type and version of the one or more autonomous operation features; and
updating the total risk level associated with autonomous operation of the vehicle, via the one or more processors, based upon the type and version of the one or more autonomous operation features.
| 19. The method of claim 11, wherein the autonomous operation feature is a vehicle-to-vehicle (V2V) wireless communication capability, and the method comprises:
receiving, via one or more vehicle-mounted processors or associated transceiver, telematics data generated or broadcast from other vehicles; and
generating and displaying alternate routes, via the one or more vehicle-mounted processors, based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions.
| 20. A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to:
determine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged;
receive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip;
determine from analysis of the usage data received in the log of usage data a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions;
determine from analysis of the usage data received in the log of usage data, or from analysis of other vehicle or telematics data received from the vehicle or mobile device, an average amount of time or miles that the vehicle operator operates the vehicle during each of the plurality of weather and road operating conditions for a period of time;
determine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions, and (c) the average amount of time or miles that the vehicle operator operates the vehicle during each of the plurality of weather and road operating conditions for the period of time to facilitate more accurate risk assessment and auto insurance pricing;
determine types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; and
adjust the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle. | The system has transceivers coupled to processors (162), and program memories (160) coupled to the processors and storing executable instructions that cause the processors to determine a total risk level associated with overall operation of a vehicle based upon a determined risk profile and the determined expected use levels that indicate whether or not a vehicle operator is expected to engage or disengage autonomous operation features during each of a plurality of weather and road operating conditions. The processors determine types of sensors installed in the vehicle based on the sensor data associated with the vehicle, and adjust the total risk level associated with autonomous operation of the vehicle. An INDEPENDENT CLAIM is included for a method for monitoring usage of a vehicle. Computer system for monitoring usage of a vehicle, i.e. an autonomous vehicle; can also be used for a semi-autonomous vehicle or a driverless vehicle. Risk assessment and premium determination for vehicle insurance policies covering vehicles with autonomous operation features can be facilitated. Driverless operation or accident avoidance can be achieved. Financial protection against physical damage and/or bodily injury resulting from traffic accidents and against liability can be provided. The drawing shows a block diagram of an exemplary computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes.100Autonomous vehicle insurance system 104Back-end components 110Mobile device 130Network 140Server 146Database 160Program memory 162Processor 164RAM
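Illustrative note (not part of the patent text): the claims above describe combining a per-condition risk profile (features engaged vs. disengaged) with expected use levels and a sensor-type adjustment. A minimal sketch of such a weighting, with all function names, condition labels, and numeric values invented for illustration, might look like this:

```python
from typing import Dict, Tuple

# Hypothetical risk profile: (condition, features_engaged?) -> risk level (arbitrary units).
RiskProfile = Dict[Tuple[str, bool], float]

def total_risk_level(risk_profile: RiskProfile,
                     expected_use: Dict[str, float],
                     engage_probability: Dict[str, float],
                     sensor_factor: float = 1.0) -> float:
    """Use-weighted blend of engaged/disengaged risk, scaled by an assumed sensor-type factor.

    expected_use: fraction of driving time (or miles) spent in each weather/road condition.
    engage_probability: likelihood the operator engages the autonomous features in that condition.
    """
    total = 0.0
    for condition, use_share in expected_use.items():
        p_engaged = engage_probability.get(condition, 0.0)
        risk_on = risk_profile.get((condition, True), 0.0)
        risk_off = risk_profile.get((condition, False), 0.0)
        total += use_share * (p_engaged * risk_on + (1 - p_engaged) * risk_off)
    return total * sensor_factor

profile: RiskProfile = {
    ("clear", True): 0.8, ("clear", False): 1.0,
    ("snow", True): 2.5, ("snow", False): 1.8,
}
use = {"clear": 0.85, "snow": 0.15}
engage = {"clear": 0.9, "snow": 0.2}   # in practice derived from the log of usage data
print(total_risk_level(profile, use, engage, sensor_factor=0.95))
```

Here sensor_factor stands in for the claimed adjustment based upon the types of sensors installed in the vehicle; how that factor would actually be derived is not specified by this sketch.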
Please summarize the input | FULLY AUTONOMOUS VEHICLE INSURANCE PRICING. Methods and systems for determining risk associated with operation of fully autonomous vehicles are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including types and version of sensors, control systems, and software. This information may be used to determine a risk profile reflecting risk levels for a plurality of features, which may be based upon test data regarding the features or actual loss data. Expected use levels may further be determined and used with the risk profile to determine a total risk level associated with operation of the vehicle by the autonomous operation features. The expected use levels may indicate expected vehicle use, as well as traffic, weather, or other conditions in which the vehicle is likely to operate. The total risk level may be used to determine or adjust aspects of an insurance policy associated with the vehicle. | 1. A computer system for monitoring usage of a vehicle having one or more autonomous operation features, comprising one or more processors and one or more program memories storing executable instructions that cause the one or more processors to:
determine a risk profile associated with operation of the vehicle (i) under operating environment conditions with the one or more autonomous operation features engaged, and (ii) under operating environment conditions with the one or more autonomous operation features disengaged;
receive a log of usage data regarding previous use of the one or more autonomous operation features by a vehicle operator during the operating environment conditions;
determine a plurality of expected use levels of the vehicle during the operating environment conditions, including whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during the operating environment conditions as determined from processor analysis of the log of usage data;
determine a risk level associated with operation of the vehicle based upon (a) the determined risk profile, and (b) the determined expected use levels; and
cause the one or more processors to automatically perform an action based upon the determined risk level, wherein the action includes one or more of: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined risk level to a reviewer via a display.
| 2. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
estimate future usage or operation of the vehicle during the operating environment conditions and with each of the one or more autonomous operation features engaged or disengaged based upon the log of usage data; and
adjust the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the estimated future usage or operation of the vehicle.
| 3. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
estimate future usage or operation of the vehicle, either by time or mileage, during the operating environment conditions; and
adjust the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is estimated to be operated in the operating environment conditions as indicated by the log of usage data.
| 4. The system of claim 1, wherein:
the risk profile associated with autonomous operation of the vehicle is based upon test result data generated from test units corresponding to the one or more autonomous operation features;
the test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios include vehicle operation with an autonomous feature engaged during the operating environment conditions; and
the test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles.
| 5. The system of claim 1, wherein the risk profile associated with autonomous operation of the vehicle is based upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in the operating environment conditions.
| 6. The system of claim 1, wherein receiving the log of usage data comprises receiving the log via wireless communication from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current operating environment conditions during the vehicle trip.
| 7. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
receive a request for a quote of a premium associated with a vehicle insurance policy via wireless communication;
determine a premium associated with the vehicle insurance policy based upon the risk level; and
present an option to purchase the vehicle insurance policy to the customer associated with the vehicle.
| 8. The system of claim 1, wherein the log of usage data includes a version of autonomous operation feature control software that is currently installed on the vehicle or in the autonomous operation feature system mounted on the vehicle.
| 9. The system of claim 1, wherein the executable instructions further cause the one or more processors to:
receive information regarding a type and version of the one or more autonomous operation features; and
update the risk level associated with autonomous operation of the vehicle based upon the type and version of the one or more autonomous operation features.
| 10. The system of claim 1, wherein the one or more autonomous operation features include a vehicle-to-vehicle (V2V) wireless communication capability, and wherein the executable instructions further cause the one or more processors to:
receive telematics data from other vehicles; and
generate and display alternate routes based upon the received telematics data.
| 11. A computer-implemented method for use in connection with a vehicle having one or more autonomous operation features, comprising:
determining, by one or more processors, a risk profile associated with operation of the vehicle (i) under operating environment conditions with the one or more autonomous operation features engaged, and (ii) under the operating environment conditions with the one or more autonomous operation features disengaged;
receiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features by a vehicle operator during the operating environment conditions;
determining, by the one or more processors, a plurality of expected use levels of the vehicle during the operating environment conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during the operating environment conditions as determined from processor analysis of the log of usage data;
determining, by the one or more processors, a risk level associated with operation of the vehicle based upon (a) the determined risk profile, and (b) the determined expected use levels; and
causing the one or more processors to automatically perform an action based upon the determined risk level, wherein the action includes one or more of: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined risk level to a reviewer via a display.
| 12. The computer-implemented method of claim 11, the method comprising:
estimating future usage or operation of the vehicle during the operating environment conditions and with the one or more autonomous operation features engaged or disengaged; and
adjusting, via the one or more processors, the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is operated in the operating environment conditions with the one or more autonomous operation features engaged or disengaged, as indicated by the log of usage data.
| 13. The computer-implemented method of claim 11, the method comprising:
estimating future usage or operation of the vehicle, either by time or mileage, during the operating environment conditions; and
adjusting, via the one or more processors, the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the amount of time or miles that the vehicle is expected to be operated in the future in the operating environment conditions.
| 14. The computer-implemented method of claim 11, wherein:
the risk profile associated with autonomous operation of the vehicle is based upon test result data generated from test units corresponding to the one or more autonomous operation features;
the test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios including vehicle operation with an autonomous feature engaged during the operating environment conditions; and
the test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles.
| 15. The computer-implemented method of claim 11, wherein the risk profile associated with autonomous operation of the vehicle is based upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in the operating environment conditions.
| 16. The computer-implemented method of claim 11, wherein receiving the log of usage data comprises receiving the log via wireless communication from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current operating environment conditions.
| 17. The method of claim 11, further comprising:
receiving, at the one or more processors or an associated transceiver, a request for a quote of a premium associated with a vehicle insurance policy via wireless communication;
determining, by one or more processors, a premium associated with the vehicle insurance policy based upon the risk level; and
presenting, by one or more processors, an option to purchase the vehicle insurance policy to the customer associated with the vehicle.
| 18. The method of claim 11, the method further comprising:
receiving, via the one or more processors or an associated transceiver, information regarding a type and version of the one or more autonomous operation features; and
updating the risk level associated with autonomous operation of the vehicle, via the one or more processors, based upon the type and version of the one or more autonomous operation features.
| 19. The method of claim 11, wherein the autonomous operation feature is a vehicle-to-vehicle (V2V) wireless communication capability, and the method comprises:
receiving, via one or more vehicle-mounted processors or associated transceiver, telematics data from other vehicles; and
generating and displaying alternate routes, via the one or more vehicle-mounted processors, based upon the telematics data.
| 20. A computer system for monitoring usage of a vehicle having one or more autonomous operation features, the system comprising one or more processors and one or more program memories storing executable instructions that cause the one or more processors to:
determine a risk profile associated with operation of the vehicle (i) under a plurality of operating environment conditions with the one or more autonomous operation features engaged, and (ii) under the operating environment conditions with the one or more autonomous operation features disengaged;
receive a log of usage data regarding previous use of the one or more autonomous operation features during the operating environment conditions;
determine from analysis of the log of usage data a plurality of expected use levels of the vehicle during the operating environment conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during the operating environment conditions;
determine from analysis of the log of usage data, or from analysis of other vehicle or telematics data received from the vehicle or mobile device, an average amount of time or miles that the vehicle operator operates the vehicle during the operating environment conditions for a period of time;
determine a risk level associated with operation of the vehicle based upon (a) the determined risk profile, (b) the determined expected use levels, and (c) the average amount of time or miles that the vehicle operator operates the vehicle during the operating environment conditions for the period of time; and
cause the one or more processors to automatically perform an action based upon the determined risk level, wherein the action includes one or more of: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined risk level to a reviewer via a display. | The computer system (100) has processors (162) and program memories (160) storing executable instructions that cause the processors to determine expected use levels of a vehicle (108) during operating environment conditions, including whether or not the vehicle operator is expected to engage or disengage the autonomous operation features during the operating environment conditions as determined from processor analysis of the log of usage data, and determine a risk level associated with operation of the vehicle based upon the determined risk profile and the determined expected use levels. The processors automatically perform an action based upon the determined risk level. The action includes one or more of: adjusting an insurance policy associated with the vehicle, determining a coverage level associated with the insurance policy, or presenting information regarding the determined risk level to a reviewer via a display. An INDEPENDENT CLAIM is included for a method for use in connection with a vehicle having one or more autonomous operation features. Computer system for monitoring usage of a vehicle having autonomous operation features, such as an autonomous vehicle or a semi-autonomous vehicle. Risk assessment and premium determination for vehicle insurance policies covering vehicles with autonomous operation features can be facilitated. Driverless operation or accident avoidance can be achieved. Financial protection against physical damage and/or bodily injury resulting from traffic accidents and against liability can be provided. The drawing shows a block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 100Computer system 108Vehicle 110Mobile device 160Program memory 162Processor
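Illustrative note (not part of the patent text): the log of usage data described in the claims above carries trip start/end timestamps and engagement/disengagement timestamps, from which expected use levels per operating condition could be derived. The sketch below assumes a hypothetical TripLog schema with one reported condition per trip; it is not the claimed implementation.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List, Tuple

@dataclass
class TripLog:
    """Hypothetical usage log for one trip, mirroring the timestamps described in the claims."""
    trip_start: datetime
    trip_end: datetime
    # (timestamp, engaged?) events for the autonomous operation features, in chronological order.
    feature_events: List[Tuple[datetime, bool]]
    condition: str  # operating environment condition reported for the trip, e.g. "rain"

def expected_use_levels(logs: List[TripLog]) -> Dict[str, float]:
    """Fraction of driving time the features were engaged, per operating condition."""
    engaged: Dict[str, float] = {}
    total: Dict[str, float] = {}
    for log in logs:
        duration = (log.trip_end - log.trip_start).total_seconds()
        total[log.condition] = total.get(log.condition, 0.0) + duration
        # Walk the engagement events and accumulate the time spent with the features engaged.
        state, last = False, log.trip_start
        for ts, is_engaged in sorted(log.feature_events) + [(log.trip_end, False)]:
            if state:
                engaged[log.condition] = engaged.get(log.condition, 0.0) + (ts - last).total_seconds()
            state, last = is_engaged, ts
    return {c: engaged.get(c, 0.0) / t for c, t in total.items() if t > 0}

# Example: a one-hour trip in rain with the features engaged for 30 minutes.
log = TripLog(datetime(2024, 1, 1, 8), datetime(2024, 1, 1, 9),
              [(datetime(2024, 1, 1, 8, 15), True), (datetime(2024, 1, 1, 8, 45), False)], "rain")
print(expected_use_levels([log]))   # {'rain': 0.5}
```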
Please summarize the input | Accident risk model determination using autonomous vehicle operating data. Methods and systems for evaluating the effectiveness of autonomous operation features of autonomous vehicles using an accident risk model are provided. According to certain aspects, an accident risk model may be determined using effectiveness information regarding autonomous operation features associated with a vehicle. The effectiveness information may indicate a likelihood of an accident for the vehicle and may include test data or actual loss data. Determining the likelihood of an accident may include determining risk factors for the features related to the ability of the features to make control decisions that successfully avoid accidents. The accident risk model may further include information regarding effectiveness of the features relative to location or operating conditions, as well as types and severity of accidents. The accident risk model may further be used to determine or adjust aspects of an insurance policy associated with an autonomous vehicle. What is claimed is:
| 1. A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising:
implementing, by one or more processors, the autonomous or semi-autonomous vehicle technology within a virtual test environment configured to simultaneously test multiple autonomous or semi-autonomous vehicle technologies;
presenting, by the one or more processors, virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
generating, by the one or more processors, test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data;
generating, by the one or more processors, an accident risk model indicating one or more risk levels for vehicle accidents associated with the autonomous or semi-autonomous vehicle technology based upon the test responses;
receiving, at the one or more processors, actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment;
adjusting, by the one or more processors, the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model;
identifying, by the one or more processors, a customer vehicle having the autonomous or semi-autonomous vehicle technology; and
generating or updating, by the one or more processors, an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model.
| 2. The computer-implemented method of claim 1, wherein:
generating the test responses includes generating test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology; and
the one or more risk levels of the accident risk model are generated based in part upon compatibility of the test responses of the autonomous or semi-autonomous vehicle technology with the additional test responses of the other autonomous or semi-autonomous vehicle technology.
| 3. The computer-implemented method of claim 2, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.
| 4. The computer-implemented method of claim 1, wherein generating the accident risk model includes determining the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.
| 5. The computer-implemented method of claim 1, further comprising:
causing, by the one or more processors, information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle via a display of a customer computing device for review.
| 6. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 7. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology involves at least one of a vehicle self-braking functionality or a vehicle self-steering functionality.
| 8. The computer-implemented method of claim 1, wherein the operating conditions are associated with one or more of the following: a road type, a time of day, or a weather condition.
| 9. A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising:
one or more processors;
one or more program memories coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the computer system to:
implement the autonomous or semi-autonomous vehicle technology within a virtual test environment configured to simultaneously test multiple autonomous or semi-autonomous vehicle technologies;
present virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
generate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data;
generate an accident risk model indicating one or more risk levels for vehicle accidents associated with the autonomous or semi-autonomous vehicle technology based upon the test responses;
receive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment;
adjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model;
identify a customer vehicle having the autonomous or semi-autonomous vehicle technology; and
generate or update an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model.
| 10. The computer system of claim 9, wherein:
the executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology; and
the one or more risk levels of the accident risk model are generated based in part upon compatibility of the test responses of the autonomous or semi-autonomous vehicle technology with the additional test responses of the other autonomous or semi-autonomous vehicle technology.
| 11. The computer system of claim 10, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.
| 12. The computer system of claim 9, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.
| 13. The computer system of claim 9, wherein the executable instructions further cause the computer system to:
communicate to a customer computing device, via a communication network, information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle for review via a display of the customer computing device.
| 14. The computer system of claim 9, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.
| 15. A tangible, non-transitory computer-readable medium storing executable instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology that, when executed by at least one processor of a computer system, cause the computer system to:
implement the autonomous or semi-autonomous vehicle technology within a virtual test environment configured to simultaneously test multiple autonomous or semi-autonomous vehicle technologies;
present virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
generate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data;
generate an accident risk model indicating one or more risk levels for vehicle accidents associated with the autonomous or semi-autonomous vehicle technology based upon the test responses;
receive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment;
adjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model;
identify a customer vehicle having the autonomous or semi-autonomous vehicle technology; and
generate or update an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model.
| 16. The tangible, non-transitory computer-readable medium of claim 15, wherein:
the executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology; and
the one or more risk levels of the accident risk model are generated based in part upon compatibility of the test responses of the autonomous or semi-autonomous vehicle technology with the additional test responses of the other autonomous or semi-autonomous vehicle technology.
| 17. The tangible, non-transitory computer-readable medium of claim 16, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.
| 18. The tangible, non-transitory computer-readable medium of claim 15, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.
| 19. The tangible, non-transitory computer-readable medium of claim 15, further storing executable instructions that, when executed by at least one processor of the computer system, cause the computer system to:
cause information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle via a display of a customer computing device for review.
| 20. The tangible, non-transitory computer-readable medium of claim 15, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data. | The method involves receiving, at processors (162), actual accident data associated with accidents involving vehicles (108) using the autonomous or semi-autonomous vehicle technology in a non-test environment, where the autonomous or semi-autonomous vehicle technology includes a vehicle self-braking functionality or a vehicle self-steering functionality. An accident risk model is adjusted by the processor based on the actual accident data by adjusting one of the risk levels of the accident risk model. A customer vehicle including the autonomous or semi-autonomous vehicle technology is identified by the processor. An insurance policy associated with the customer vehicle is generated or updated by the processor based on the adjusted risk level of the adjusted accident risk model. INDEPENDENT CLAIMS are also included for the following: a computer system; and a tangible, non-transitory computer-readable medium comprising a set of instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology by a computer system during urban driving or motorway driving conditions. Method for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, e.g. vehicle self-braking functionality or vehicle self-steering functionality, by a computer system (all claimed) during urban driving or motorway driving conditions. The method enables assisting a vehicle operator to safely or efficiently operate the vehicle, or enables taking full control of vehicle operation, by providing autonomous vehicle operation features. The method enables monitoring driving experience and/or usage of the autonomous or semi-autonomous vehicle technology over small time-frames in real-time and periodically providing feedback to a driver and an insurance provider and/or adjusting the insurance policies or premiums. The method enables determining a vehicle insurance premium by evaluating the vehicle's ability to avoid and/or mitigate crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 102Front end components 104Back-end components 108Accidents involving vehicles 114Client device 162Processor
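Illustrative note (not part of the patent text): one way to read the claimed effectiveness metric calculated from the test responses is as the share of virtual test scenarios in which the technology avoided a collision, mapped onto a risk level. The names, the metric definition, and the baseline_risk mapping below are assumptions introduced for illustration, not the claimed method.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TestResponse:
    """Response of the technology to one virtual test scenario (hypothetical schema)."""
    scenario_id: str
    collision_avoided: bool
    severity_if_collision: float = 0.0   # 0 (none) .. 1 (severe)

def effectiveness_metric(responses: List[TestResponse]) -> float:
    """Share of virtual test scenarios in which the technology avoided a collision."""
    if not responses:
        return 0.0
    return sum(r.collision_avoided for r in responses) / len(responses)

def risk_level_from_effectiveness(effectiveness: float, baseline_risk: float = 1.0) -> float:
    """Map effectiveness onto a risk level: the more scenarios handled, the lower the risk.

    baseline_risk is an assumed reference risk for a vehicle without the technology.
    """
    return baseline_risk * (1.0 - effectiveness)

responses = [TestResponse("wet_merge", True), TestResponse("night_pedestrian", False, 0.7),
             TestResponse("stop_and_go", True)]
e = effectiveness_metric(responses)
print(e, risk_level_from_effectiveness(e))   # 0.666..., 0.333...
```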
Please summarize the input | VEHICULAR TRAFFIC ALERTS FOR AVOIDANCE OF ABNORMAL TRAFFIC CONDITIONS. Methods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle. An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle. What is claimed is:
| 1. A computer-implemented method of analyzing abnormal traffic conditions, the method comprising:
determining, via one or more processors, a risk level of an abnormal traffic condition detected in a vehicle operating environment;
transmitting data comprising the abnormal traffic condition to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component performs an action based upon a type of anomalous condition to modify the anomalous condition into an altered roadway condition with an adjusted risk level; and
transmitting, via the one or more processors, an electronic message to a nearby vehicle via wireless communication to alert the nearby vehicle of the altered roadway condition and to allow the nearby vehicle to determine whether to avoid or approach the altered roadway condition.
| 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of a vehicle.
| 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 4. The computer-implemented method of claim 1 further comprising updating, via the one or more processors, a risk averse profile associated with an operator of the nearby vehicle based upon whether the nearby vehicle was operated in a manner to avoid or approach the altered roadway condition.
| 5. The computer-implemented method of claim 4, wherein the smart infrastructure component comprises a smart traffic light.
| 6. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 7. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.
| 8. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 9. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the vehicle operating environment.
| 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.
| 11. A computer system configured to analyze abnormal traffic conditions, the computer system comprising one or more processors, the one or more processors configured to:
determine, via one or more processors, a risk level of an abnormal traffic condition detected in a vehicle operating environment;
transmit data comprising the abnormal traffic condition to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component performs an action based upon a type of anomalous condition to modify the anomalous condition into an altered roadway condition with an adjusted risk level; and
transmit, via the one or more processors, an electronic message to a nearby vehicle via wireless communication to alert the nearby vehicle of the altered roadway condition and to allow the nearby vehicle to determine whether to avoid or approach the altered roadway condition.
| 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of a vehicle.
| 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.
| 14. The computer system of claim 11, the system further configured to update, via the one or more processors, a risk averse profile associated with an operator of the nearby vehicle based upon whether the nearby vehicle was operated in a manner to avoid or approach the altered roadway condition.
| 15. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.
| 16. The computer system of claim 11, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.
| 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.
| 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 20. The computer system of claim 11, wherein the nearby vehicle travels to the vehicle operating environment. | The computer-implemented method involves determining, via one or more processors, a risk level of an abnormal traffic condition detected in a vehicle operating environment. Data comprising the abnormal traffic condition is transmitted to a smart infrastructure component (208) within a proximity of the vehicle operating environment. The smart infrastructure component performs an action based upon a type of anomalous condition to modify the anomalous condition into an altered roadway condition with an adjusted risk level. The processors then transmit an electronic message to a nearby vehicle (202a) via wireless communication to alert the nearby vehicle of the altered roadway condition and to allow the nearby vehicle to determine whether to avoid or approach the altered roadway condition. An INDEPENDENT CLAIM is included for a computer system configured to analyze abnormal traffic conditions. Computer-implemented method for analyzing abnormal traffic conditions such as an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the vehicle. The data collected may be used to generate vehicle-usage profiles that more accurately reflect vehicle risk, or lack thereof, and facilitate more appropriate auto insurance pricing. The electronic message may then be transmitted through the vehicle's transceiver using wireless communication to the nearby vehicle to alert nearby vehicles of the abnormal traffic condition and to allow them to avoid it. The drawing shows a block diagram of the system that collects telematics and/or other data, and uses V2X wireless communication to broadcast the collected data to other vehicles, mobile devices, remote servers, and smart infrastructure. 200 Notification system; 201 Network; 202a Nearby vehicle; 203 Direct radio link; 208 Smart infrastructure component
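The claims above describe a detect-relay-mitigate-alert flow: a risk level is determined for a detected condition, the condition is relayed to a smart infrastructure component that acts to alter it, and nearby vehicles are alerted and decide whether to avoid or approach the altered roadway condition. The Python sketch below is a minimal, assumed rendering of that flow for illustration only; the class names, the values in RISK_TABLE, and the distance-threshold check (echoing claim 10) are not taken from the patent.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical risk scores per condition type (not taken from the patent).
RISK_TABLE = {"erratic_vehicle": 0.9, "bad_weather": 0.7, "road_construction": 0.5}

@dataclass
class TrafficCondition:
    kind: str                  # e.g. "road_construction"
    location: tuple            # GPS-like (x, y) coordinates
    risk_level: float = 0.0

class SmartInfrastructure:
    """Stand-in for a smart traffic light or similar component (208)."""
    def mitigate(self, cond: TrafficCondition) -> TrafficCondition:
        # Perform an action based on the condition type, yielding an
        # "altered roadway condition" with an adjusted (lower) risk level.
        return TrafficCondition(cond.kind, cond.location, max(0.0, cond.risk_level - 0.3))

class NearbyVehicle:
    def __init__(self, name: str, location: tuple, threshold: float = 5.0):
        self.name, self.location, self.threshold = name, location, threshold

    def handle_alert(self, cond: TrafficCondition) -> str:
        # Claim 10: ignore the message if the condition is beyond a threshold distance.
        dist = hypot(cond.location[0] - self.location[0], cond.location[1] - self.location[1])
        if dist > self.threshold:
            return "ignore"
        # Otherwise decide whether to avoid or approach the altered condition.
        return "avoid" if cond.risk_level > 0.4 else "approach"

def broadcast_flow(kind, location, infrastructure, nearby_vehicles):
    cond = TrafficCondition(kind, location, RISK_TABLE.get(kind, 0.5))
    altered = infrastructure.mitigate(cond)   # data transmitted to the infrastructure component
    return {v.name: v.handle_alert(altered) for v in nearby_vehicles}

if __name__ == "__main__":
    print(broadcast_flow("road_construction", (0.0, 0.0), SmartInfrastructure(),
                         [NearbyVehicle("car_a", (1.0, 1.0)), NearbyVehicle("car_b", (50.0, 50.0))]))
```

Running the example prints one decision per nearby vehicle: the close vehicle approaches the mitigated condition, while the distant vehicle ignores the message entirely.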
Please summarize the input | Autonomous communication feature use. Methods and systems for determining collision risk associated with operation of autonomous vehicles using autonomous communication are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including features associated with autonomous communication between vehicles or with infrastructure. This information may be used to determine collision risk levels for a plurality of features, which may be based upon test data regarding the features or actual collision data. Expected use levels and autonomous communication levels may further be determined and used with the collision risk levels to determine a total collision risk level associated with operation of the vehicle. The autonomous communication levels may indicate the types of communications, the levels of communication with other vehicles or infrastructure, or the frequency of autonomous communication. What is claimed is:
| 1. A computer-implemented method for determining collision risk of one or more autonomous operation features of a vehicle, comprising:
receiving, by an autonomous communication feature of the one or more autonomous operation features, autonomous vehicle-to-vehicle communication data from one or more additional vehicles operating within communication range of the vehicle;
controlling, by an on-board computer of the vehicle and the one or more autonomous operation features, operation of the vehicle using the one or more autonomous operation features and the received autonomous vehicle-to-vehicle communication data;
communicating, from the on-board computer of the vehicle to one or more processors of a server via a communication network, information regarding the one or more autonomous operation features of the vehicle, including information regarding the autonomous communication feature of the vehicle and a log of vehicle operation data;
receiving, at the one or more processors of the server from the on-board computer of the vehicle via the communication network, the information regarding the one or more autonomous operation features of the vehicle;
determining, by the one or more processors of the server, a plurality of collision risk levels associated with autonomous operation of the vehicle under a plurality of operating environments based upon the information regarding the one or more autonomous operation features;
determining, by the one or more processors of the server, a plurality of expected use levels of the vehicle based upon entries in the log of vehicle operation data, wherein the expected use levels are associated with the plurality of operating environments;
determining, by the one or more processors of the server, a plurality of autonomous communication levels within the plurality of operating environments associated with the plurality of expected use levels for the vehicle based upon locations and times associated with the operating environments during prior operation of the vehicle, wherein the autonomous communication levels indicate availability of each of a plurality of types of autonomous communication capability in other vehicles as a proportion of the other vehicles in corresponding operating environments that exhibit the types of autonomous communication capability;
determining, by the one or more processors of the server, a total collision risk level associated with operation of the vehicle based at least in part upon the determined collision risk levels, the determined expected use levels, and the determined autonomous communication levels; and
causing, by the one or more processors of the server, one or more of the following actions to be performed based upon the determined total collision risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total collision risk level to a reviewer via a display of a reviewer computing device to verify the determined total collision risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.
| 2. The method of claim 1, wherein the autonomous communication levels further include information relating to one or more of the following: levels of autonomous communication with infrastructure, or frequency of autonomous communications between the vehicle and the other vehicles.
| 3. The method of claim 1, further comprising receiving, at one or more processors, information regarding previous use of the one or more autonomous operation features of the vehicle, and wherein the plurality of expected use levels are determined, at least in part, based upon the information regarding previous use of the one or more autonomous operation features.
| 4. The method of claim 3, wherein the information regarding previous use of the autonomous operation features includes information regarding previous use of the autonomous communication feature.
| 5. The method of claim 1, wherein the information regarding the one or more autonomous operation features of the vehicle is based upon (i) test results for test units corresponding to the one or more autonomous operation features, which test results include responses of the test units to test inputs corresponding to test scenarios, and (ii) actual collision data associated with a plurality of other vehicles having at least one of the one or more autonomous operation features.
| 6. The method of claim 1, wherein the total collision risk level is determined without reference to factors relating to collision risk associated with a vehicle operator.
| 7. The method of claim 1, further comprising:
receiving, at one or more processors, information regarding a vehicle operator; and
determining, by one or more processors, an operator collision-risk profile associated with vehicle operation by the vehicle operator;
wherein the total collision risk level is determined, at least in part, based upon the operator collision-risk profile.
| 8. A computer system for determining collision risk of one or more autonomous operation features of a vehicle, comprising:
one or more processors;
an autonomous communication feature of the one or more autonomous operation features, configured to receive autonomous vehicle-to-vehicle communication data from one or more additional vehicles operating within communication range of the vehicle;
an on-board computer within the vehicle, configured to control operation of the vehicle using the one or more autonomous operation features and the received autonomous vehicle-to-vehicle communication data;
one or more communication modules adapted to communicate data from the on-board computer to the one or more processors via a communication network; and
a program memory coupled to the one or more processors and storing executable instructions that when executed by the one or more processors cause the computer system to:
receive, via the communication network, information regarding the one or more autonomous operation features of the vehicle, including information regarding the autonomous communication feature of the vehicle and a log of vehicle operation data;
determine a plurality of collision risk levels associated with autonomous operation of the vehicle under a plurality of operating environments based upon the information regarding the one or more autonomous operation features;
determine a plurality of expected use levels of the vehicle based upon entries in the log of vehicle operation data, wherein the expected use levels are associated with the plurality of operating environments;
determine a plurality of autonomous communication levels within the plurality of operating environments associated with the plurality of expected use levels for the vehicle based upon locations and times associated with the operating environments during prior operation of the vehicle, wherein the autonomous communication levels indicate availability of each of a plurality of types of autonomous communication capability in other vehicles as a proportion of the other vehicles in corresponding operating environments that exhibit the types of autonomous communication capability;
determine a total collision risk level associated with operation of the vehicle based at least in part upon the determined collision risk levels, the determined expected use levels, and the determined autonomous communication levels; and
cause one or more of the following actions to be performed based upon the determined total collision risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total collision risk level to a reviewer via a display of a reviewer computing device to verify the determined total collision risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.
| 9. The computer system of claim 8, wherein the autonomous communication levels further include information relating to one or more of the following: levels of autonomous communication with infrastructure, or frequency of autonomous communications between the vehicle and the other vehicles.
| 10. The computer system of claim 8, wherein the executable instructions further cause the computer system to receive information regarding previous use of the one or more autonomous operation features of the vehicle, and wherein the plurality of expected use levels are determined, at least in part, based upon the information regarding previous use of the one or more autonomous operation features.
| 11. The computer system of claim 10, wherein the information regarding previous use of the autonomous operation features includes information regarding previous use of the autonomous communication feature.
| 12. The computer system of claim 8, wherein the information regarding the one or more autonomous operation features of the vehicle is based upon (i) test results for test units corresponding to the one or more autonomous operation features, which test results include responses of the test units to test inputs corresponding to test scenarios, and (ii) actual collision data associated with a plurality of other vehicles having at least one of the one or more autonomous operation features.
| 13. The computer system of claim 8, wherein the total collision risk level is determined without reference to factors relating to collision risks associated with a vehicle operator.
| 14. The computer system of claim 8, wherein the executable instructions further cause the computer system to:
receive information regarding a vehicle operator; and
determine an operator collision-risk profile associated with vehicle operation by the vehicle operator;
wherein the total collision risk level is determined, at least in part, based upon the operator collision-risk profile.
| 15. A tangible, non-transitory computer-readable medium storing instructions for determining collision risk of one or more autonomous operation features of a vehicle that, when executed by at least one processor of a computer system, cause the computer system to:
receive autonomous vehicle-to-vehicle communication data from one or more additional vehicles operating within communication range of the vehicle by an autonomous communication feature of the one or more autonomous operation features;
control operation of the vehicle using the one or more autonomous operation features and the received autonomous vehicle-to-vehicle communication data by an on-board computer of the vehicle and the one or more autonomous operation features;
communicate information regarding the one or more autonomous operation features of the vehicle, including information regarding the autonomous communication feature of the vehicle and a log of vehicle operation data, from the on-board computer of the vehicle to a server via a communication network;
receive the information regarding the one or more autonomous operation features of the vehicle at the server from the on-board computer;
determine a plurality of collision risk levels associated with autonomous operation of the vehicle under a plurality of operating environments based upon the information regarding the one or more autonomous operation features;
determine a plurality of expected use levels of the vehicle based upon entries in the log of vehicle operation data, wherein the expected use levels are associated with the plurality of operating environments;
determine a plurality of autonomous communication levels within the plurality of operating environments associated with the plurality of expected use levels for the vehicle based upon locations and times associated with the operating environments during prior operation of the vehicle, wherein the autonomous communication levels indicate availability of each of a plurality of types of autonomous communication capability in other vehicles as a proportion of the other vehicles in corresponding operating environments that exhibit the types of autonomous communication capability;
determine a total collision risk level associated with operation of the vehicle based at least in part upon the determined collision risk levels, the determined expected use levels, and the determined autonomous communication levels; and
cause one or more of the following actions to be performed based upon the determined total collision risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total collision risk level to a reviewer via a display of a reviewer computing device to verify the determined total collision risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.
| 16. The tangible, non-transitory computer-readable medium of claim 15, wherein the autonomous communication levels further include information relating to one or more of the following: levels of autonomous communication with infrastructure, or frequency of autonomous communications between the vehicle and the other vehicles.
| 17. The tangible, non-transitory computer-readable medium of claim 15, further comprising executable instructions that, when executed by at least one processor of a computer system, cause the computer system to receive information regarding previous use of the one or more autonomous operation features of the vehicle, and wherein the plurality of expected use levels are determined, at least in part, based upon the information regarding previous use of the one or more autonomous operation features.
| 18. The tangible, non-transitory computer-readable medium of claim 17, wherein the information regarding previous use of the autonomous operation features includes information regarding previous use of the autonomous communication feature. | The method involves determining a total collision risk level associated with operation of a vehicle (108), based upon collision risk levels, expected use levels, and autonomous communication levels, by processors (162) of a server (140). One or more of the following actions are then caused to be performed by the processors of the server based upon the determined total collision risk level: an insurance policy associated with the vehicle is adjusted; a coverage level associated with the insurance policy is determined; information regarding the determined total collision risk level is presented to a reviewer via a display of a reviewer computing device to verify the determined total collision risk level; or the determination is presented to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. INDEPENDENT CLAIMS are also included for the following: a computer system for determining collision risk of autonomous operation features of a vehicle; and a tangible, non-transitory computer-readable medium comprising a set of instructions for determining collision risk of autonomous operation features of a vehicle. Method for determining collision risk of autonomous operation features of a vehicle, e.g. a smart car. The method allows near real-time as well as periodic uploads and downloads of information. The method provides autonomous vehicle operation features that assist the vehicle operator to safely or efficiently operate the vehicle, or that take full control of vehicle operation under some or all circumstances. The method enables monitoring driving experience and/or usage of the autonomous or semi-autonomous vehicle technology in real time, over small timeframes, and/or periodically to provide feedback to the driver or insurance provider and/or to adjust insurance policies or premiums. The method enables determining an automobile insurance premium by evaluating how effectively the vehicle can avoid and/or mitigate crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 100 Autonomous vehicle insurance system; 102 Front-end components; 104 Back-end components; 108 Vehicle; 130 Network; 140 Server; 162 Processors
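Claim 1 combines three inputs: per-environment collision risk levels, expected use levels derived from the vehicle operation log, and autonomous communication levels expressed as the proportion of other vehicles in each environment that exhibit a communication capability. The claims do not specify how these are combined, so the sketch below assumes a simple use-weighted sum in which higher V2V availability discounts the risk contribution; the function name, the dictionary layout, and the discount factor are illustrative assumptions.

```python
# Illustrative combination of risk, use, and autonomous communication levels.
# The claims define the inputs but not the combining formula; the use-weighted
# sum and the v2v_discount factor below are assumptions for illustration.

def total_collision_risk(risk_levels, expected_use, comm_levels, v2v_discount=0.5):
    """All three arguments are dicts keyed by operating environment
    (e.g. 'highway_day'); comm_levels holds the proportion of other vehicles
    in that environment exhibiting V2V communication capability."""
    total = 0.0
    for env, risk in risk_levels.items():
        use = expected_use.get(env, 0.0)
        v2v_share = comm_levels.get(env, 0.0)
        # Higher V2V availability reduces the effective risk contribution.
        total += use * risk * (1.0 - v2v_discount * v2v_share)
    return total

if __name__ == "__main__":
    risk = {"highway_day": 0.02, "city_night": 0.08}
    use = {"highway_day": 0.7, "city_night": 0.3}
    comm = {"highway_day": 0.6, "city_night": 0.1}
    print(f"total collision risk level: {total_collision_risk(risk, use, comm):.4f}")
```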
Please summarize the input | Accident fault determination for autonomous vehicles. Methods and systems for determining fault for an accident involving a vehicle having one or more autonomous (and/or semi-autonomous) operation features are provided. According to certain aspects, operating data from sensors within or near the vehicle may be used to determine fault for a vehicle accident, such as a collision. The operating data may include information regarding use of the features at the time of the accident and may further be used to determine an allocation of fault for the accident between a vehicle operator, the autonomous operation features, or a third party. The allocation of fault may be used to determine and/or adjust coverage levels for an insurance policy associated with the vehicle. The allocation of fault may further be used to adjust risk levels or profiles associated with the vehicle operator or with the autonomous operation features. What is claimed is:
| 1. A computer system for reconstructing a vehicle crash, the computer system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to:
receive vehicle operating data for a vehicle having one or more autonomous operation features for controlling the vehicle, the vehicle operating data being generated and transmitted by an on-board computer or mobile device using wireless communication or data transmission, wherein the vehicle operating data includes:
(i) sensor data from one or more vehicle-mounted sensors associated with the one or more autonomous operation features, the sensor data also indicating a configuration or setting of each autonomous operation feature before and during the vehicle crash; and
(ii) a recorded log of decisions made by the one or more autonomous operation features, and commands sent from the on-board computer to control components to operate the vehicle, before and during the vehicle crash;
receive an indication of an accident involving the vehicle, or otherwise determine that the vehicle has been involved in the accident based upon processor analysis of the (i) sensor data and (ii) recorded log received;
generate a crash reconstruction representing a sequence of events involved in the accident by automatically determining, for each of a plurality of times in the sequence of events, each of the following: (i) a location of the vehicle based upon the sensor data, (ii) a location of an obstruction involved in the accident based upon the sensor data, and (iii) a movement of the vehicle based upon the decisions in the recorded log;
determine an allocation of fault for the accident for each of the one or more autonomous operation features based at least in part upon the crash reconstruction; and
cause one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.
| 2. The computer system of claim 1, wherein the one or more processors analyze the vehicle operating data received to determine an extent of vehicle damage, or a cost to repair the damage or replace part or all of the vehicle, the vehicle operating data including video or image data.
| 3. The computer system of claim 1, wherein determining an allocation of fault for the accident for the one or more autonomous operation features further includes the one or more processors analyzing data generated by the vehicle-mounted sensors or cameras depicting a vehicle environment and data from the sensors regarding the response of the vehicle to its environment prior to, or during, the vehicle crash.
| 4. The computer system of claim 1, wherein determining an allocation of fault for the accident for the one or more autonomous operation features further includes the one or more processors analyzing wireless communications or data transmissions to and from the vehicle, including vehicle-to-vehicle or infrastructure-to-vehicle communications.
| 5. The computer system of claim 1, wherein the one or more vehicle-mounted sensors include one or more of a GPS (Global Positioning System) unit, a radar unit, a LIDAR unit, an ultrasonic sensor, an infrared sensor, a camera, an accelerometer, a tachometer, or a speedometer, and at least one sensor is configured to actively or passively scan a vehicle environment of the vehicle for obstacles, including other vehicles, buildings, and pedestrians.
| 6. The computer system of claim 1, wherein the one or more vehicle-mounted sensors include one or more of an ignition sensor, an odometer, a system clock, a speedometer, a tachometer, an accelerometer, a gyroscope, a compass, a geolocation or GPS unit, a camera, or a distance sensor.
| 7. The computer system of claim 1, the one or more processors further configured to:
determine an allocation of fault for the accident based, at least in part, upon whether or not the vehicle was being operated in accordance with optimal use levels for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon road, weather, or traffic conditions at the time of, or prior to, the vehicle crash, the optimal use levels being associated with a lowest risk of vehicle crash.
| 8. The computer system of claim 1, the one or more processors further configured to:
determine an optimal use level for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon current road, weather, or traffic conditions;
compare the optimal use level with a current actual use level for the variety of combinations of configurations and settings associated with the one or more autonomous operation features; and
if the optimal use level differs from the current actual use level, generate and transmit an electronic notification to the vehicle or vehicle operator's mobile device recommending that the optimal use level be used.
| 9. The computer system of claim 1, wherein determining the allocation of fault further includes determining, by a processor, a point of impact on the vehicle, or an indication of a state of one or more traffic signals before, or during, the vehicle crash.
| 10. The computer system of claim 1, the one or more processors further configured to receive data indicating engagement of at least one of the one or more autonomous operation features before the vehicle crash; and
determining the allocation of fault for the vehicle crash includes the one or more processors analyzing whether the autonomous operation feature failed to take appropriate control actions or whether control signals were ineffective in controlling the vehicle immediately prior to the vehicle crash.
| 11. The computer system of claim 1, the one or more processors further configured to receive data indicating engagement or disengagement of the one or more autonomous operation features before the vehicle crash; and
wherein determining the allocation of fault for the vehicle crash includes the one or more processors analyzing whether the vehicle had time to take action to avoid the accident but that action was not taken.
| 12. The computer system of claim 1, the one or more processors and transceivers further configured to receive data indicating engagement or disengagement of the one or more autonomous operation features before the vehicle crash; and
wherein determining the allocation of fault for the vehicle crash includes the one or more processors determining that autonomous operation of the vehicle prior to the vehicle crash was no longer feasible due to conditions in a vehicle environment of the vehicle.
| 13. The computer system of claim 1, the one or more processors and transceivers configured to receive data indicating engagement of the one or more autonomous operation features before the vehicle crash; and
wherein determining the allocation of fault for the vehicle crash includes the one or more processors determining whether the one or more autonomous operation features attempted to return control of the vehicle to the vehicle operator prior to the vehicle crash and whether or not an adequate period of time for transition was available prior to the vehicle crash.
| 14. The computer system of claim 1, wherein the vehicle operating data received via wireless communication further includes telematics data indicating vehicle operation before and during a vehicle crash, including vehicle speed, heading, acceleration, and braking; and
the one or more processors are configured to (1) determine that the vehicle has been involved in the accident based upon processor analysis of the (i) sensor data, (ii) recorded log received, and (iii) telematics data; and (2) determine an allocation of fault for the accident for the one or more autonomous operation features based at least in part upon the received (i) sensor data, (ii) recorded log received, and (iii) telematics data.
| 15. A computer-implemented method for reconstructing a vehicle crash, comprising:
receiving, via one or more processors or an associated transceiver, vehicle operating data for a vehicle having one or more autonomous operation features for controlling the vehicle, the vehicle operating data being generated and transmitted by an on-board computer or mobile device using wireless communication or data transmission, wherein the vehicle operating data includes:
(i) sensor data from one or more vehicle-mounted sensors associated with the one or more autonomous operation features, the sensor data also indicating a configuration or setting of each autonomous operation feature before and during the vehicle crash; and
(ii) a recorded log of decisions made by the one or more autonomous operation features, and commands sent from the on-board computer to control components to operate the vehicle, before and during the vehicle crash;
receiving, via the one or more processors or associated transceiver, an indication of an accident involving the vehicle, or otherwise determining, via the one or more processors, that the vehicle has been involved in the accident based upon processor analysis of the (i) sensor data, and (ii) recorded log received;
generating, by the one or more processors, a crash reconstruction representing a sequence of events involved in the accident by automatically determining, for each of a plurality of times in the sequence of events, each of the following: (i) a location of the vehicle based upon the sensor data, (ii) a location of an obstruction involved in the accident based upon the sensor data, and (iii) a movement of the vehicle based upon the decisions in the recorded log;
determining, by the one or more processors, an allocation of fault for the accident for each of the one or more autonomous operation features based at least in part upon the crash reconstruction; and
causing, by the one or more processors, one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.
| 16. The computer-implemented method of claim 15, wherein the one or more processors analyze the vehicle operating data received to determine an extent of vehicle damage, or a cost to repair the damage or replace part or all of the vehicle, the vehicle operating data including video or image data.
| 17. The computer-implemented method of claim 15, wherein determining, by the one or more processors, an allocation of fault for the accident for the one or more autonomous operation features further includes processor analysis of data generated by the vehicle-mounted sensors or cameras depicting a vehicle environment and data from the sensors regarding the response of the vehicle to its environment prior to, and during, the vehicle crash.
| 18. The computer-implemented method of claim 15, wherein determining, by the one or more processors, an allocation of fault for the accident for the one or more autonomous operation features further includes processor analysis of wireless communications or data transmissions to and from the vehicle, including vehicle-to-vehicle or infrastructure-to-vehicle communications.
| 19. The computer-implemented method of claim 15, the method comprising:
determining, by the one or more processors, an allocation of fault for the accident based, at least in part, upon whether or not the vehicle was being operated in accordance with optimal use levels for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon road, weather, or traffic conditions at the time of, or prior to, the vehicle crash, the optimal use levels being associated with a lowest risk of vehicle crash.
| 20. The computer-implemented method of claim 15, the method comprising:
determining, by the one or more processors, an optimal use level for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon current road, weather, or traffic conditions;
comparing the optimal use level with a current actual use level for the variety of combinations of configurations and settings associated with the one or more autonomous operation features; and if the optimal use level differs from the current actual use level, generating and transmitting a notification to the vehicle or vehicle operator's mobile device recommending that the optimal use level be used. | The system has a processor for determining an allocation of fault for the accident for each of a set of autonomous operation features based upon a crash reconstruction. The processor causes one or more of the following actions to be performed based upon the determined total risk level: it adjusts an insurance policy associated with a vehicle, determines a coverage level associated with the insurance policy, presents information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or presents the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. An INDEPENDENT CLAIM is also included for a method for reconstructing a vehicle crash. Computer system for reconstructing a vehicle crash, e.g. an autonomous vehicle crash. The system reduces risks associated with vehicle operation, uses the server to allocate fault for the accident to the autonomous operation features, and adjusts risk levels and/or risk profiles associated with the autonomous operation features. The system increases autonomous operation feature performance by facilitating near real-time uploads and downloads of information. The drawing shows a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 100 Autonomous vehicle insurance system; 102 Front-end component; 104 Back-end component; 110 Mobile device; 114 Communication component
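The independent claims build the crash reconstruction by stepping through a sequence of times and, at each time, recording the vehicle location and obstruction location from sensor data and the vehicle movement from the recorded decision log, then allocating fault to the autonomous operation features. The sketch below assumes simple list-of-dictionary inputs and a deliberately crude fault rule (engaged features share fault equally unless the last logged command was a braking command); both are assumptions for illustration, not the patent's method.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Event:
    t: float
    vehicle_pos: tuple        # from the vehicle-mounted sensors
    obstruction_pos: tuple    # from the vehicle-mounted sensors
    movement: str             # from the recorded log of decisions/commands

def reconstruct(sensor_frames: List[dict], decision_log: List[dict]) -> List[Event]:
    """Merge time-stamped sensor frames and logged commands into a timeline."""
    commands_by_t = {d["t"]: d["command"] for d in decision_log}
    return [Event(f["t"], f["vehicle_pos"], f["obstruction_pos"],
                  commands_by_t.get(f["t"], "coast"))
            for f in sorted(sensor_frames, key=lambda f: f["t"])]

def allocate_fault(timeline: List[Event], features_engaged: Dict[str, bool]) -> Dict[str, float]:
    """Assumed rule: if the last logged command was not a braking command,
    split fault equally among the engaged autonomous operation features."""
    braked = bool(timeline) and timeline[-1].movement == "brake"
    active = [name for name, engaged in features_engaged.items() if engaged]
    if braked or not active:
        return {name: 0.0 for name in active}
    return {name: 1.0 / len(active) for name in active}

if __name__ == "__main__":
    frames = [{"t": 0.0, "vehicle_pos": (0, 0), "obstruction_pos": (30, 0)},
              {"t": 1.0, "vehicle_pos": (15, 0), "obstruction_pos": (30, 0)}]
    log = [{"t": 1.0, "command": "steer_left"}]
    timeline = reconstruct(frames, log)
    print(timeline)
    print(allocate_fault(timeline, {"adaptive_cruise": True, "lane_keep": False}))
```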
Please summarize the input | Vehicular traffic alerts for avoidance of abnormal traffic conditions. Methods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle. An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle. What is claimed is:
| 1. A computer-implemented method of generating a vehicle traffic alert and updating a vehicle-usage profile, the method comprising:
detecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a first vehicle;
generating, via the one or more processors, an electronic message regarding abnormal traffic condition;
transmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition;
receiving, via the one or more processors, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and
updating, via the one or more processors, a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle.
| 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.
| 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 4. The computer-implemented method of claim 1, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 5. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.
| 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the first vehicle.
| 9. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component:
analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and
performs an action based upon the type of anomalous condition in order to modify the anomalous condition.
| 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.
| 11. A computer system configured to generate a vehicle traffic alert and update a vehicle-usage profile, the computer system comprising one or more processors, the one or more processors configured to:
detect that an abnormal traffic condition exists in an operating environment of a first vehicle;
generate an electronic message regarding the abnormal traffic condition;
transmit the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition;
receive telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and
update a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle.
| 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.
| 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.
| 14. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.
| 15. The computer system of claim 11, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 16. The computer system of claim 11, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.
| 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.
| 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 20. The computer system of claim 11, wherein the nearby vehicle travels to the operating environment of the first vehicle. | The method involves detecting (1104), via a processor, that an abnormal traffic condition exists in an operating environment of a first vehicle. An electronic message is generated (1106) regarding the abnormal traffic condition. The electronic message is transmitted (1108) to a nearby vehicle via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. Telematics data regarding operation of the nearby vehicle is received after the nearby vehicle received the electronic message. A vehicle-usage profile associated with the nearby vehicle is updated based upon the received telematics data regarding operation of the nearby vehicle. An INDEPENDENT CLAIM is included for a system configured to generate a vehicle traffic alert and update a vehicle-usage profile. Method for generating a vehicle traffic alert and updating a vehicle-usage profile. The travel recommendations reduce or lower risk and enhance driver or vehicle safety. The insurance policies are adjusted, generated, and updated. The drawing shows a flow diagram of a traffic condition broadcast method. 1100 Method for traffic condition broadcast; 1102 Step for collecting sensor data regarding a vehicle operating environment from sensors; 1104 Step for detecting that an abnormal traffic condition exists in an operating environment; 1106 Step for generating an electronic message regarding the abnormal traffic condition; 1108 Step for transmitting the electronic message through wireless communication to alert the nearby vehicle of the abnormal traffic condition
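Claim 1 closes a feedback loop: after the nearby vehicle receives the alert, its telematics data is collected and used to update its vehicle-usage profile, which in turn can drive an insurance premium adjustment (claims 4 and 15). The sketch below is an assumed minimal model of that loop; the profile fields, the Manhattan-distance clearance test for deciding whether the condition was avoided, and the min_clearance value are illustrative, not specified by the claims.

```python
from dataclasses import dataclass

@dataclass
class VehicleUsageProfile:
    operator_id: str
    alerts_received: int = 0
    alerts_avoided: int = 0

    @property
    def avoidance_rate(self) -> float:
        return self.alerts_avoided / self.alerts_received if self.alerts_received else 0.0

def update_profile(profile: VehicleUsageProfile, telematics: dict,
                   condition_location: tuple, min_clearance: float = 10.0) -> None:
    """Assumed rule: the condition counts as avoided if the nearby vehicle's
    closest post-alert position stayed at least min_clearance away from it."""
    profile.alerts_received += 1
    closest = min(abs(x - condition_location[0]) + abs(y - condition_location[1])
                  for x, y in telematics["positions"])
    if closest >= min_clearance:
        profile.alerts_avoided += 1

if __name__ == "__main__":
    profile = VehicleUsageProfile("operator-42")
    post_alert_telematics = {"positions": [(100, 0), (60, 25), (40, 40)]}  # received after the alert
    update_profile(profile, post_alert_telematics, condition_location=(0, 0))
    print(profile, f"avoidance rate: {profile.avoidance_rate:.2f}")
```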
Please summarize the input | Method and system for enhancing the functionality of a vehicle. Methods and systems for enhancing the functionality of a semi-autonomous vehicle are described herein. The semi-autonomous vehicle may receive a communication from a fully autonomous vehicle within a threshold distance of the semi-autonomous vehicle. If the vehicles are travelling on the same route or the same portion of a route, the semi-autonomous vehicle may navigate to a location behind the fully autonomous vehicle. Then the semi-autonomous vehicle may operate autonomously by replicating one or more functions performed by the fully autonomous vehicle. The functions and/or maneuvers performed by the fully autonomous vehicle may be detected via sensors in the semi-autonomous vehicle and/or may be identified by communicating with the fully autonomous vehicle to receive indications of upcoming maneuvers. In this manner, the semi-autonomous vehicle may act as a fully autonomous vehicle. What is claimed is:
| 1. A computer-implemented method for enhancing the functionality of a vehicle, comprising:
broadcasting, via one or more processors and/or associated transceivers of a semi-autonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication;
receiving, at the one or more processors and/or associated transceivers of the semi-autonomous vehicle via vehicle-to-vehicle communication, an indication directly from several autonomous vehicles that each autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle;
selecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and
for a portion of the route, causing, by the one or more processors, the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the autonomous vehicle.
| 2. The computer-implemented method of claim 1, wherein the one or more processors periodically re-verify that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.
| 3. The computer-implemented method of claim 1, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from a vehicle operator to operate.
| 4. The computer-implemented method of claim 3, wherein the semi-autonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.
| 5. The computer-implemented method of claim 1, wherein the semi-autonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.
| 6. The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes:
receiving, at the one or more processors, an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and
causing, by the one or more processors, the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location.
| 7. The computer-implemented method of claim 6, further comprising:
receiving, at the one or more processors, an indication of a speed at which the selected autonomous vehicle is travelling; and
causing, by the one or more processors, the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed.
| 8. The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes:
detecting, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and
causing, by the one or more processors, the semi-autonomous vehicle to perform a same maneuver as the detected maneuver.
| 9. The computer-implemented method of claim 1, wherein a vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the autonomous vehicle; and
when the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the method further includes causing, by the one or more processors, the semi-autonomous vehicle to operate without input from a vehicle operator.
| 10. The computer-implemented method of claim 1, wherein selecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' route, respectively.
| 11. A computer system configured to enhance the functionality of a vehicle, the computer system comprising one or more local or remote processors, transceivers, and/or sensors configured to:
broadcast, via a semi-autonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication;
receive, at the semi-autonomous vehicle via vehicle-to-vehicle communication, an indication directly from several fully autonomous or fully operational autonomous vehicles that each fully autonomous or fully operational autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle;
select, at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and
for a portion of the route, cause the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the selected autonomous vehicle.
| 12. The computer system of claim 11, wherein the semi-autonomous vehicle periodically re-verifies that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.
| 13. The computer system of claim 11, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from a vehicle operator to operate.
| 14. The computer system of claim 13, wherein the semiautonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.
| 15. The computer system of claim 11, wherein the semiautonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.
| 16. The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to:
receive an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and
cause the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location.
| 17. The computer system of claim 16, wherein one or more local or remote processors, transceivers, and/or sensors are further configured to:
receive an indication of a speed at which the selected autonomous vehicle is travelling; and
cause the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed.
| 18. The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to:
detect, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and
cause the semi-autonomous vehicle to perform a same maneuver as the detected maneuver.
| 19. The computer system of claim 11, wherein a vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the selected autonomous vehicle; and
when the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to cause the semi-autonomous vehicle to operate without input from a vehicle operator.
| 20. The computer system of claim 11, wherein selecting, at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' route, respectively. | The method involves broadcasting, via processors and/or associated transceivers of a semi-autonomous vehicle (108) having autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication. An autonomous vehicle is selected, at the processors of the semi-autonomous vehicle, from among the several autonomous vehicles within the predetermined communication range based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle. For a portion of the route, the processors cause the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the autonomous vehicle. An INDEPENDENT CLAIM is included for a computer system configured to enhance the functionality of a vehicle. Computer-based method for enhancing the functionality of a vehicle by caravanning with fully autonomous vehicles. The data application facilitates data communication between the front-end components and the back-end components for more efficient processing and data storage. The automobile insurance premium may be determined by evaluating how effectively the vehicle may be able to avoid and/or mitigate crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a block diagram of an autonomous vehicle data system for autonomous vehicle operation, monitoring, communication and related functions. 100 Autonomous vehicle data system; 108 Semi-autonomous vehicle; 110 Mobile devices; 120 Sensors; 130 Network
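Claims 1 and 11 describe a three-step sequence: broadcast a follow request, receive indications (with identification information) from autonomous vehicles in range, select the vehicle with the best safety rating, and then mimic its maneuvers for a portion of the route, with claims 2 and 12 adding a pull-over fallback when the following distance grows too large. The sketch below assumes an in-memory safety-rating lookup and a shared-route check (in the spirit of claims 10 and 20); the names, ratings, and gap threshold are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical lookup of safety ratings by vehicle identification information.
SAFETY_RATINGS = {"AV-001": 4.7, "AV-002": 4.9, "AV-003": 3.8}

@dataclass
class Indication:
    vehicle_id: str      # identification information from the V2V reply
    route: List[str]     # planned route segments

def select_leader(indications: List[Indication], my_route: List[str]) -> Optional[str]:
    """Pick the highest-rated autonomous vehicle that shares at least one
    upcoming route segment with the semi-autonomous vehicle."""
    shared = [i for i in indications if set(i.route) & set(my_route)]
    if not shared:
        return None
    return max(shared, key=lambda i: SAFETY_RATINGS.get(i.vehicle_id, 0.0)).vehicle_id

def follow(leader_maneuvers: List[str], gap_readings: List[float],
           max_gap: float = 30.0) -> List[str]:
    """Mimic each maneuver while the leader stays within max_gap metres;
    otherwise pull over and park (the claim 2 / claim 12 fallback)."""
    actions = []
    for maneuver, gap in zip(leader_maneuvers, gap_readings):
        if gap > max_gap:
            actions.append("pull_over_and_park")
            break
        actions.append(maneuver)   # replicate the leader's maneuver
    return actions

if __name__ == "__main__":
    replies = [Indication("AV-001", ["I-90 E", "Exit 23"]),
               Indication("AV-003", ["Route 9"])]
    leader = select_leader(replies, ["I-90 E", "Exit 23", "Main St"])
    print("selected leader:", leader)
    print(follow(["lane_change_left", "brake", "turn_right"], [12.0, 18.0, 45.0]))
```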
Please summarize the input | Autonomous vehicle insurance pricing and offering based upon accident riskMethods and systems for monitoring use, determining risk, and pricing insurance policies for an autonomous vehicle having one or more autonomous operation features are provided. According to certain aspects, accident risk factors may be determined for autonomous operation features of the vehicle using information regarding the autonomous operation features of the vehicle or other accident related factors associated with the vehicle. The accident risk factors may indicate the ability of the autonomous operation features to avoid accidents during operation, particularly without vehicle operator intervention. The accident risk levels determined for a vehicle may further be used to determine or adjust aspects of an insurance policy associated with the vehicle.What is claimed is:
| 1. A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising:
generating, by one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies, test results for the autonomous or semi-autonomous vehicle technology, wherein the computing systems generate the test results as hardware or software responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data that simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
receiving, at one or more processors, information regarding the test results;
determining, by one or more processors, an indication of reliability of the autonomous or semi-autonomous vehicle technology based upon the test results, including compatibility of the autonomous or semi-autonomous vehicle technology with the at least one additional autonomous or semi-autonomous vehicle technologies tested;
determining, by one or more processors, an accident risk factor based upon the received information regarding the test results and the indication of reliability by analyzing an effect on a risk associated with a potential vehicle accident of the autonomous or semi-autonomous vehicle technology, wherein the accident risk factor is determined based upon an ability of a version of artificial intelligence of the autonomous or semi-autonomous vehicle technology to avoid collisions without human interaction;
determining, by one or more processors, one or more vehicle insurance policy premiums for one or more vehicles based at least in part upon the determined accident risk factor; and
causing, by one or more processors, information regarding the one or more vehicle insurance policies to be presented to one or more customers for review.
| 2. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology includes at least one of a fully autonomous vehicle feature or a limited human driver control feature.
| 3. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology performs at least one of the following functions:
steering;
accelerating;
braking;
monitoring blind spots;
presenting a collision warning;
adaptive cruise control; or
parking.
| 4. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology is related to at least one of the following:
driver alertness monitoring;
driver responsiveness monitoring;
pedestrian detection;
artificial intelligence;
a back-up system;
a navigation system;
a positioning system;
a security system;
an anti-hacking measure;
a theft prevention system; or
remote vehicle location determination.
| 5. The computer-implemented method of claim 1, further comprising receiving, at one or more processors, an accident-related factor, wherein:
the accident risk factor is further determined based in part upon the received accident-related factor, and
the accident-related factor is related to at least one of the following:
a point of impact;
a type of road;
a time of day;
a weather condition;
a type of a trip;
a length of a trip;
a vehicle style;
a vehicle-to-vehicle communication; or
a vehicle-to-infrastructure communication.
| 6. The computer-implemented method of claim 1, wherein the accident risk factor is further determined for the autonomous or semi-autonomous vehicle technology based upon at least one of the following: (1) a type of the autonomous or semi-autonomous vehicle technology, (2) a version of computer instructions of the autonomous or semi-autonomous vehicle technology, (3) an update to computer instructions of the autonomous or semi-autonomous vehicle technology, or (4) an update to the artificial intelligence associated with the autonomous or semi-autonomous vehicle technology.
| 7. The computer-implemented method of claim 1, wherein the method further includes determining at least one of a discount, a refund, or a reward associated with the one or more vehicle insurance policies based upon the accident risk factor determined for the autonomous or semi-autonomous vehicle technology.
| 8. The computer-implemented method of claim 1, wherein the received information further includes at least one of a database or a model of accident risk assessment based upon information regarding past vehicle accident information.
| 9. The computer-implemented method of claim 1, wherein causing information regarding the one or more vehicle insurance policies to be presented to the one or more customers for review includes communicating to each customer an insurance premium for automobile insurance coverage.
| 10. A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising:
one or more processors;
one or more communication modules adapted to communicate data;
one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies to generate test results for the autonomous or semi-autonomous vehicle technology, wherein the computing systems generate the test results as hardware or software responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data that simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment, and wherein the test results are communicated to the one or more processors via the one or more communication modules; and
a program memory coupled to the one or more processors and storing executable instructions that when executed by the one or more processors cause the computer system to:
receive information regarding the test results;
determine an indication of reliability of the autonomous or semi-autonomous vehicle technology based upon the test results, including compatibility of the autonomous or semi-autonomous vehicle technology with the at least one additional autonomous or semi-autonomous vehicle technologies tested;
determine an accident risk factor based upon the received information regarding the test results and the indication of reliability by analyzing an effect on a risk associated with a potential vehicle accident of the autonomous or semi-autonomous vehicle technology, wherein the accident risk factor is determined based upon an ability of a version of artificial intelligence of the autonomous or semi-autonomous vehicle technology to avoid collisions without human interaction;
determine one or more vehicle insurance policy premiums for one or more vehicles based at least in part upon the determined accident risk factor;
and cause information regarding the one or more vehicle insurance policies to be presented to one or more customers for review.
| 11. The computer system of claim 10, wherein the accident risk factor is further determined for the autonomous or semi-autonomous vehicle technology based upon at least one of the following: (1) a type of the autonomous or semi-autonomous vehicle technology, (2) a version of computer instructions of the autonomous or semi-autonomous vehicle technology, (3) an update to computer instructions of the autonomous or semi-autonomous vehicle technology, or (4) an update to the artificial intelligence associated with the autonomous or semi-autonomous vehicle technology.
| 12. The computer system of claim 10, wherein the received information further includes at least one of a database or a model of accident risk assessment based upon information regarding past vehicle accident information.
| 13. The computer system of claim 10, wherein the autonomous or semi-autonomous vehicle technology includes at least one of a fully autonomous vehicle feature or a limited human driver control feature.
| 14. The computer system of claim 10, wherein the executable instructions that cause the computer system to cause information regarding the one or more vehicle insurance policies to be presented to the one or more customers for review include instructions that cause the computer system to communicate to each customer an insurance premium for automobile insurance coverage.
| 15. A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous driving package of computer instructions, the method comprising:
generating, by one or more computing systems configured to evaluate the autonomous or semi-autonomous driving package operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous driving packages of computer instructions, test results for the autonomous or semi-autonomous driving package of computer instructions in the virtual test environment, wherein the computing systems generate the test results as responses of the computer instructions implemented within the virtual test environment to virtual test sensor data that simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment;
determining, by one or more processors, an indication of reliability of the autonomous or semi-autonomous driving package based upon the test results, including compatibility of the autonomous or semi-autonomous driving package with the at least one additional autonomous or semi-autonomous driving packages tested;
analyzing, by one or more processors, loss experience associated with the computer instructions to determine effectiveness in actual driving situations;
determining, by one or more processors, a relative accident risk factor for artificial intelligence of the computer instructions based upon the ability of the computer instructions to make automated or semi-automated driving decisions for a vehicle that avoid collisions using the test results, the indication of reliability, and analysis of loss experience;
determining, by one or more processors, one or more vehicle insurance policy premiums for one or more vehicles based at least in part upon the relative risk factor assigned to the artificial intelligence of the autonomous or semi-autonomous driving package of computer instructions; and
causing, by one or more processors, information regarding the one or more vehicle insurance policies to be presented to one or more customers for review.
| 16. The computer-implemented method of claim 15, wherein the autonomous or semi-autonomous driving package of computer instructions are stored on a non-transitory computer readable medium and direct autonomous or semi-autonomous vehicle functionality related to at least one of the following functions:
steering;
accelerating;
braking;
monitoring blind spots;
presenting a collision warning;
adaptive cruise control; or
parking.
| 17. The computer-implemented method of claim 15, wherein the autonomous or semi-autonomous driving package of computer instructions are stored on a non-transitory computer readable medium and direct autonomous or semi-autonomous vehicle functionality related to at least one of the following:
driver alertness monitoring;
driver responsiveness monitoring;
pedestrian detection;
artificial intelligence;
a back-up system;
a navigation system;
a positioning system;
a security system;
an anti-hacking measure;
a theft prevention system; or
remote vehicle location determination.
| 18. The computer-implemented method of claim 15, wherein the relative accident risk factor is based upon, at least in part, at least one accident-related factor, including:
a point of impact;
a type of road;
a time of day;
a weather condition;
a type of a trip;
a length of a trip;
a vehicle style;
a vehicle-to-vehicle communication; or
a vehicle-to-infrastructure communication.
| 19. The computer-implemented method of claim 15, the method further comprising adjusting at least one of an insurance premium, a discount, a refund, or a reward associated with the one or more vehicle insurance policies based upon the relative accident risk factor.
| 20. The computer-implemented method of claim 15, wherein causing information regarding the one or more vehicle insurance policies to be presented to the one or more customers for review by the one or more customers includes communicating to each customer a cost of automobile insurance coverage. | The method involves generating test results for autonomous or semi-autonomous vehicle technology. The information regarding the test results is received (1004). An indication of reliability of the autonomous or semi-autonomous vehicle technology is determined (1010). An accident risk factor is determined (1012) based upon the received information regarding the test results and the indication of reliability. The vehicle insurance policy premiums for vehicles are determined (1014). The information regarding the vehicle insurance policies is presented to the customers for review. INDEPENDENT CLAIMS are included for the following: a computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology; and a computer-based method of evaluating effectiveness of an autonomous or semi-autonomous driving package of computer instructions. Computer-based method of evaluating effectiveness of autonomous or semi-autonomous vehicle technology. An automobile insurance premium is determined by evaluating how effectively the vehicle is able to avoid and/or mitigate crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The autonomous vehicle operation features assist the vehicle operator to more safely or efficiently operate a vehicle or take full control of vehicle operation under some or all circumstances. The autonomous or semi-autonomous vehicle technology and/or the autonomous or semi-autonomous driving package of computer instructions can perform functions such as steering, accelerating, braking, monitoring blind spots, presenting a collision warning, adaptive cruise control, and/or parking, and relate to driver alertness monitoring, driver responsiveness monitoring, pedestrian detection, artificial intelligence, a back-up system, a navigation system, a positioning system, a security system, an anti-hacking measure, a theft prevention system, and/or remote vehicle location determination. The drawing shows a flow diagram depicting an autonomous vehicle insurance pricing method for determining risk and premiums for vehicle insurance policies covering autonomous vehicles with autonomous communication features. 1004Step for receiving information regarding test results 1006Step for determining risk levels associated with autonomous operation 1010Step for determining indication of reliability of autonomous or semi-autonomous vehicle technology 1012Step for determining accident risk factor 1014Step for determining vehicle insurance policy premiums
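A purely illustrative Python sketch of the pricing flow described above (simulated test results and reliability feeding an accident risk factor that then scales a premium). The weighting scheme and the names accident_risk_factor, adjusted_premium, and baseline_risk are invented for illustration and are not the actuarial model of the patent.

def accident_risk_factor(collision_avoidance_rate, reliability, compatibility):
    # Lower is better. Start from the share of simulated scenarios in which the
    # technology failed to avoid a collision without human interaction, then
    # penalize low reliability and poor compatibility with co-tested technologies.
    base = 1.0 - collision_avoidance_rate
    return base * (2.0 - reliability) * (2.0 - compatibility)

def adjusted_premium(base_premium, risk_factor, baseline_risk=0.25):
    # Scale the premium in proportion to risk relative to an assumed baseline.
    return round(base_premium * (risk_factor / baseline_risk), 2)

if __name__ == "__main__":
    rf = accident_risk_factor(collision_avoidance_rate=0.95, reliability=0.90, compatibility=0.80)
    print(rf)                              # approximately 0.05 * 1.10 * 1.20 = 0.066
    print(adjusted_premium(1000.0, rf))    # 264.0 with the assumed baseline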
Please summarize the input | Method and system for enhancing the functionality of a vehicleMethods and systems for enhancing the functionality of a semi-autonomous vehicle are described herein. The semi-autonomous vehicle may receive a communication from a fully autonomous vehicle within a threshold distance of the semi-autonomous vehicle. If the vehicles are travelling on the same route or the same portion of a route, the semi-autonomous vehicle may navigate to a location behind the fully autonomous vehicle. Then the semi-autonomous vehicle may operate autonomously by replicating one or more functions performed by the fully autonomous vehicle. The functions and/or maneuvers performed by the fully autonomous vehicle may be detected via sensors in the semi-autonomous vehicle and/or may be identified by communicating with the fully autonomous vehicle to receive indications of upcoming maneuvers. In this manner, the semi-autonomous vehicle may act as a fully autonomous vehicle.What is claimed is:
| 1. A computer-implemented method for enhancing the functionality of a vehicle, comprising:
broadcasting, via one or more processors and/or associated transceivers of a semiautonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication when the semi-autonomous vehicle is operating in a partially autonomous mode of operation with at least some of the control decisions being made by a vehicle operator;
receiving, at the one or more processors and/or associated transceivers of the semiautonomous vehicle via vehicle-to-vehicle communication, an indication directly from several autonomous vehicles that each autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle;
selecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and
for a portion of the route, causing, by the one or more processors, the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the autonomous vehicle, such that the semi-autonomous vehicle is capable of operating without input from the vehicle operator along the same portion of the route.
| 2. The computer-implemented method of claim 1, wherein the one or more processors periodically re-verify that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.
| 3. The computer-implemented method of claim 1, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from the vehicle operator to operate.
| 4. The computer-implemented method of claim 3, wherein the semi-autonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.
| 5. The computer-implemented method of claim 1, wherein the semi-autonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.
| 6. The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes:
receiving, at the one or more processors, an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and
causing, by the one or more processors, the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location.
| 7. The computer-implemented method of claim 6, further comprising:
receiving, at the one or more processors, an indication of a speed at which the selected autonomous vehicle is travelling; and
causing, by the one or more processors, the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed.
| 8. The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes:
detecting, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and
causing, by the one or more processors, the semi-autonomous vehicle to perform a same maneuver as the detected maneuver.
| 9. The computer-implemented method of claim 1, wherein the vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the autonomous vehicle; and
when the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the method further includes causing, by the one or more processors, the semi-autonomous vehicle to operate without input from the vehicle operator.
| 10. The computer-implemented method of claim 1, wherein selecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' route, respectively.
| 11. A computer system configured to enhance the functionality of a vehicle, the computer system comprising one or more local or remote processors, transceivers, and/or sensors configured to:
broadcast, via a semi-autonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication when the semi-autonomous vehicle is operating in a partially autonomous mode of operation with at least some of the control decisions being made by a vehicle operator;
receive, at the semi-autonomous vehicle via vehicle-to-vehicle communication, an indication directly from several fully autonomous or fully operational autonomous vehicles that each fully autonomous or fully operational autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle;
select, at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semiautonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and
for a portion of the route, cause the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the selected autonomous vehicle, such that the semi-autonomous vehicle is capable of operating without input from the vehicle operator.
| 12. The computer system of claim 11, wherein the semiautonomous vehicle periodically re-verifies that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.
| 13. The computer system of claim 11, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from the vehicle operator to operate.
| 14. The computer system of claim 13, wherein the semiautonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.
| 15. The computer system of claim 11, wherein the semiautonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.
| 16. The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to:
receive an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and
cause the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location.
| 17. The computer system of claim 16, wherein one or more local or remote processors, transceivers, and/or sensors are further configured to:
receive an indication of a speed at which the selected autonomous vehicle is travelling; and
cause the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed.
| 18. The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to:
detect, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and
cause the semi-autonomous vehicle to perform a same maneuver as the detected maneuver.
| 19. The computer system of claim 11, wherein the vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the selected autonomous vehicle; and
when the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to cause the semi-autonomous vehicle to operate without input from the vehicle operator.
| 20. The computer system of claim 11, wherein selecting at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semiautonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' route, respectively. | The method involves broadcasting, via a processor and/or associated transceivers of a semi-autonomous vehicle, a request to follow a fully autonomous vehicle (502) within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication. An indication is received directly from a number of autonomous vehicles that each autonomous vehicle is within the predetermined communication range (504). An autonomous vehicle is selected from among the autonomous vehicles within the predetermined communication range based upon a safety rating of each of the autonomous vehicles as determined according to identification information for each autonomous vehicle. The semi-autonomous vehicle is caused to follow the selected autonomous vehicle (510) and mimic each maneuver performed by the autonomous vehicle (512), such that the semi-autonomous vehicle is capable of operating without input from the vehicle operator along the same portion of the route. An INDEPENDENT CLAIM is also included for a computer system configured to enhance the functionality of a vehicle. Method for enhancing functionality of semi-autonomous vehicle. The method enables the fully autonomous vehicle to act as a guide to ensure the semi-autonomous vehicle is safe to make a particular maneuver, when the semi-autonomous vehicle does not have the sensor capabilities to detect and/or monitor all of its surroundings. The drawing shows the flow diagram of an autonomous vehicle caravan method for causing a semi-autonomous vehicle to follow a fully autonomous vehicle. 502Broadcast request to follow fully autonomous vehicle 504Receive communication from autonomous vehicle within predetermined communication range 506Compare route for fully autonomous vehicle to route for semi-autonomous vehicle 510Cause semi-autonomous vehicle to follow selected autonomous vehicle 512Mimic each maneuver performed by autonomous vehicle
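The follow/mimic behaviour and the pull-over fallback described in this record can be sketched as a tiny Python simulation. Everything here (SimVehicle, follow_step, the 50 m threshold) is a hypothetical illustration of the claim language, not the patent's implementation.

import math

FOLLOW_DISTANCE_LIMIT_M = 50.0   # hypothetical threshold, not taken from the claims

class SimVehicle:
    # Minimal stand-in for a vehicle; a real system would wrap sensors and actuators.
    def __init__(self, x):
        self.x = x
        self.parked = False
    def position(self):
        return (self.x, 0.0)

def gap_m(a, b):
    ax, ay = a.position()
    bx, by = b.position()
    return math.hypot(ax - bx, ay - by)

def follow_step(follower, leader, announced_maneuver):
    # One iteration of the follow loop: park if the leader is out of range,
    # otherwise mimic whatever maneuver the leader announced for this step.
    if gap_m(follower, leader) > FOLLOW_DISTANCE_LIMIT_M:
        follower.parked = True
        return "pulled over and parked"
    if announced_maneuver is not None:
        return f"performing {announced_maneuver}"
    return "holding position behind leader"

if __name__ == "__main__":
    follower, leader = SimVehicle(0.0), SimVehicle(30.0)
    print(follow_step(follower, leader, ("turn_left", "in 200 m")))
    leader.x = 120.0   # leader pulls away beyond the threshold
    print(follow_step(follower, leader, None))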
Please summarize the input | Vehicular traffic alerts for avoidance of abnormal traffic conditionsMethods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle. An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle.What is claimed is:
| 1. A computer-implemented method of generating a vehicle traffic alert and updating a vehicle-usage profile, the method comprising:
detecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a first vehicle;
generating, via the one or more processors, an electronic message regarding the abnormal traffic condition;
transmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition;
receiving, via the one or more processors, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and
updating, via the one or more processors, a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.
| 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 4. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 5. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.
| 6. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 7. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the first vehicle.
| 8. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component:
analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and
performs an action based upon the type of anomalous condition in order to modify the anomalous condition.
| 9. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.
| 10. A computer system configured to generate a vehicle traffic alert and update a vehicle-usage profile, the computer system comprising one or more processors, the one or more processors configured to:
detect that an abnormal traffic condition exists in an operating environment of a first vehicle;
generate an electronic message regarding the abnormal traffic condition;
transmit the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition;
receive telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and
update a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.
| 11. The computer system of claim 10, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.
| 12. The computer system of claim 10, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.
| 13. The computer system of claim 10, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.
| 14. The computer system of claim 10, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.
| 15. The computer system of claim 10, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 16. The computer system of claim 10, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.
| 17. The computer system of claim 10, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 18. The computer system of claim 10, wherein the nearby vehicle travels to the operating environment of the first vehicle.
| 19. A computer-implemented method of generating a vehicle traffic alert and updating a vehicle-usage profile, the method comprising:
detecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a first vehicle;
generating, via the one or more processors, an electronic message regarding the abnormal traffic condition;
transmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; and
receiving, via the one or more processors, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message,
wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle. | The method involves detecting that an abnormal traffic condition exists in an operating environment of a first vehicle through multiple processors. An electronic message regarding the abnormal traffic condition is generated. The electronic message is transmitted to a nearby vehicle, where the electronic message is transmitted through wireless communication to alert the nearby vehicle (108) of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The telematics data regarding operation of the nearby vehicle is received after the nearby vehicle has received the electronic message. A vehicle-usage profile associated with the nearby vehicle is updated based upon the received telematics data regarding operation of the nearby vehicle, where updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle. An INDEPENDENT CLAIM is included for a computer system for generating a vehicle traffic alert and updating a vehicle-usage profile. Method for generating a vehicle traffic alert and updating the usage profile of a vehicle, such as a slow-moving vehicle, e.g. farm machinery, construction equipment, or an oversized load vehicle, or an emergency vehicle, e.g. ambulance, fire truck, or police vehicle, equipped to transmit an electronic message indicating its presence to a nearby vehicle. The communication unit is configured to conditionally send data, which is particularly advantageous when the computing device is implemented as a mobile computing device, as such conditions help to reduce power usage and prolong battery life. The second computing device ignores the telematics data, thus saving processing power and battery life. The external computing device updates the earlier profile based upon new telematics data; such updates occur periodically or upon occurrence of an event. The drawing shows a block diagram of the telematics collection system. 100Telematics collection system 106External computing device 108Nearby vehicle 110Computing device 120Tactile alert system
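One plausible way the nearby vehicle's side of this alert could be handled is sketched below in Python. The haversine distance check, the 5 km threshold, and the function names are assumptions made for illustration, not elements taken from the claims.

import math

ALERT_RADIUS_KM = 5.0   # hypothetical threshold distance

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two GPS coordinates in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def handle_alert(own_position, alert):
    # alert is a dict like {"condition": "slowed traffic", "lat": ..., "lon": ...}
    d = haversine_km(own_position[0], own_position[1], alert["lat"], alert["lon"])
    if d > ALERT_RADIUS_KM:
        return "ignored (condition beyond threshold distance)"
    # A real system would replan the route here and stream telematics to a remote server
    # so the vehicle-usage profile could be updated.
    return f"rerouting to avoid {alert['condition']} {d:.1f} km ahead"

if __name__ == "__main__":
    print(handle_alert((41.88, -87.63), {"condition": "road construction", "lat": 41.90, "lon": -87.65}))
    print(handle_alert((41.88, -87.63), {"condition": "bad weather", "lat": 42.50, "lon": -88.50}))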
Please summarize the input | Vehicular traffic alerts for avoidance of abnormal traffic conditionsMethods and systems are described for generating a vehicle-to-vehicle traffic alert. Various aspects may include detecting that an abnormal traffic condition exists in an operating environment of a vehicle and generating a related electronic message. The electronic message may be transmitted via the vehicle's transceiver using a wireless communication to a nearby vehicle to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition.What is claimed is:
| 1. A computer-implemented method of generating a vehicle-to-vehicle traffic alert, the method comprising:
detecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a vehicle;
generating, via the one or more processors, an electronic message regarding the abnormal traffic condition;
transmitting, via a vehicle-mounted transceiver associated with the vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; and
updating a risk aversion profile associated with a vehicle operator of the nearby vehicle based upon the electronic message, wherein the risk aversion profile is associated with a travel environment for the nearby vehicle, the travel environment including at least an environment where the nearby vehicle has traveled two or more times.
| 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the vehicle.
| 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 4. The computer-implemented method of claim 1, the method further comprising generating an auto insurance discount associated with the vehicle.
| 5. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.
| 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the vehicle.
| 9. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component:
analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and
performs an action based on the type of anomalous condition in order to modify the anomalous condition.
| 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and wherein the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.
| 11. A computer system configured to generate a vehicle-to-vehicle traffic alert, the computer system comprising one or more processors, the one or more processors configured to:
detect that an abnormal traffic condition exists in an operating environment of a vehicle;
generate an electronic message regarding the abnormal traffic condition;
transmit, via a vehicle-mounted transceiver associated with the vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; and
update a risk aversion profile associated with a vehicle operator of the nearby vehicle based upon the electronic message, wherein the risk aversion profile is associated with a travel environment for the nearby vehicle, the travel environment including at least an environment where the nearby vehicle has traveled two or more times.
| 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the vehicle.
| 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.
| 14. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.
| 15. The computer system of claim 11, the system further configured to generate, an auto insurance discount associated with the vehicle.
| 16. The computer system of claim 11, wherein the one or more processors is one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.
| 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.
| 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.
| 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.
| 20. The computer system of claim 11, wherein the nearby vehicle travels to the operating environment of the vehicle. | The method involves transmitting the electronic message through wireless communication to alert a nearby vehicle (108) of an abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. A risk aversion profile associated with a vehicle operator (106) of the nearby vehicle is updated based upon the electronic message. The risk aversion profile is associated with a travel environment for the nearby vehicle, the travel environment including an environment where the nearby vehicle has traveled several times. An INDEPENDENT CLAIM is included for a computer system configured to generate a vehicle-to-vehicle traffic alert. Computer-based method for generating a vehicle-to-vehicle traffic alert. The insurance policies, such as vehicle or life insurance policies, can be adjusted, generated, and/or updated based upon an individual's usage and/or acceptance of travel recommendations, such as travel recommendations that reduce or lower risk and/or enhance driver or vehicle safety. The risk can be reduced by curbing road rage through reporting of negative driving behavior. Risk-averse customers can receive insurance discounts or other insurance cost savings based upon data that reflects low-risk driving behavior and/or technology that mitigates or prevents risk to insured assets, such as vehicles or even homes, and/or vehicle operators or passengers. The drawing shows a block diagram of the telematics collection system. 106Vehicle operator 108Vehicle 114Board computer 116Link 122Speaker
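A minimal sketch, assuming a simple dictionary-backed store, of how an alert message might be built on the sending side and how a risk aversion profile keyed by travel environment might be updated. RiskAversionProfiles, the scoring increments, and the identifiers are hypothetical and do not come from the patent.

from collections import defaultdict

def build_alert(condition, lat, lon):
    # Electronic message describing the abnormal traffic condition and its GPS location.
    return {"condition": condition, "lat": lat, "lon": lon}

class RiskAversionProfiles:
    # Hypothetical store of per-operator risk-aversion scores keyed by travel environment
    # (for example, a commute the vehicle has driven two or more times).
    def __init__(self):
        self._scores = defaultdict(float)
    def update(self, operator_id, environment_id, avoided_condition):
        # Credit the operator slightly for avoiding a reported condition in a familiar environment.
        key = (operator_id, environment_id)
        self._scores[key] += 1.0 if avoided_condition else -0.5
        return self._scores[key]

if __name__ == "__main__":
    alert = build_alert("erratic vehicle", 40.71, -74.01)
    profiles = RiskAversionProfiles()
    print(alert)
    print(profiles.update(operator_id="driver-42", environment_id="daily-commute", avoided_condition=True))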
Please summarize the input | Shared control for vehicles travelling in formationMethods and apparatus for controlling two or more vehicles travelling in formation. Selected vehicles may be fully or partially autonomously controlled; at least one vehicle is partially controlled by a human driver. Information is collected at each vehicle and from the drivers and it is shared with other vehicles and drivers to create a shared world model. Aspects of the shared world model may be presented to the human driver, who may then respond with a control input. Autonomy systems and the drivers on the vehicles then collaborate to make a collective decision to act or not to act and execute any such action in a coordinated manner.The invention claimed is:
| 1. A method for collaborative control of a platoon of vehicles wherein a first vehicle is at least partially controllable by a human driver and a second vehicle is at least partially controllable by autonomy logic, comprising:
collecting information from human driver inputs on the first vehicle;
collecting information from sensors on both the first vehicle and the second vehicle;
sharing information thus collected between the first vehicle and the second vehicle to provide shared information;
each vehicle using the shared information to maintain a respective local copy of a shared world model, wherein maintaining the respective local copy of the shared world model comprises:
maintaining a local model, including by processing, by the first vehicle, at least a subset of the collected information from sensors on both the first vehicle and the second vehicle to derive perception information and situation information, wherein the derived perception information comprises one or more attributes of one or more sensed objects and wherein the derived situation information comprises information about one or more sensed events,
after deriving the derived perception information and the derived situation information, inputting the derived perception information and the derived situation information to a first local copy of the shared world model at the first vehicle,
receiving, by the second vehicle, the derived perception information and the derived situation information added to the first local copy of the shared world model, and
inputting the derived perception information and the derived situation information to a second local copy of the shared world model at the second vehicle;
collaboratively determining, by both the first and the second vehicle in direct vehicle to vehicle communication, to perform a proposed action based on the first and second local copies of the shared world model; and
in accordance with determining, by both the first and the second vehicle, to perform the proposed action, performing, by either or both the first and the second vehicle, the proposed action.
| 2. The method of claim 1 wherein the proposed action is to update a state of the world model.
| 3. The method of claim 1 wherein the proposed action is proposed by the autonomy logic and the proposed action may be vetoed by the human driver.
| 4. The method of claim 1 wherein the proposed action is proposed by the human driver and the proposed action may be vetoed by the autonomy logic.
| 5. The method of claim 1 wherein the proposed action includes either the first or second vehicle joining or leaving the platoon.
| 6. The method of claim 5 wherein the proposed action includes a third vehicle which is at least partially controlled by a human joining the platoon behind the first vehicle, and then the third vehicle entering an autonomous driving mode.
| 7. The method of claim 5 wherein the proposed action includes the first vehicle leaving an autonomous driving mode and entering a human-controlled mode and exiting the platoon.
| 8. The method of claim 1 wherein the proposed action includes swapping roles of a leader vehicle and a follower vehicle in the platoon.
| 9. The method of claim 1 wherein the proposed action includes
either the first or second vehicle changing lanes; or
either the first or second vehicle entering or leaving a travel lane; or
either the first or second vehicle increasing or decreasing speed or distance to another vehicle; or
either the first or second vehicle maneuvering to park next to another vehicle.
| 10. The method of claim 1 wherein the human driver inputs include information conveyed visually, via audio, or physically such as by forces on a joystick, or a steering device, or other input device.
| 11. The method of claim 1 wherein determining whether to perform the proposed action includes propagating, between the first vehicle and the second vehicle, constraints imposed on either the first vehicle or the second vehicle.
| 12. The method of claim 11 wherein the constraints include the autonomy logic discouraging but not preventing the human from making a steering decision.
| 13. The method of claim 1 wherein the shared information includes
data originating outside components of the autonomy logic or human control and derived data;
data originating inside the autonomy logic or human control; and/or
physical phenomena that is capable of being sensed by the human driver.
| 14. The method of claim 1, further comprising:
displaying at least a selected portion of the shared information on a display associated with the first vehicle. | The method involves collecting information from human driver inputs on the first vehicle. Information is collected from sensors (112) on both the first vehicle and the second vehicle. The information thus collected is shared between the first vehicle and the second vehicle to provide shared information. Each vehicle is enabled to collaboratively engage in a decision (124) with the other vehicle as a unit using a world model (126). A proposed action is proposed by the autonomy logic (122), where the proposed action includes either the first or second vehicle joining or leaving the platoon. An INDEPENDENT CLAIM is included for an interface for enabling collaborative control of a platoon of vehicles. Method for realizing collaborative control of a platoon of vehicles such as commercial vehicles, e.g. long-haul trucks. Can also be used in autonomous vehicles. The method enables the autonomy logic on one or both vehicles to collaborate with each other and with the human driver to improve operation of the platoon. The method allows a human to function as a resource for the robot, providing assistance with cognition and perception during task execution and enabling the human to compensate for inadequacies of the autonomy. The drawing shows a schematic diagram of an autonomous and a human-driven truck. 112Sensors 114Actuators 122Autonomy 124Decision 126World model
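The shared-world-model merge and the two-party veto described in the claims above can be illustrated with a short Python sketch. SharedWorldModel, collaborative_decision, and the example payloads are invented for illustration and do not reproduce the patent's software.

class SharedWorldModel:
    # Local copy kept on each vehicle; entries are merged from locally derived
    # perception/situation information and from updates received over V2V.
    def __init__(self):
        self.objects = {}     # object_id -> attributes (perception information)
        self.events = []      # sensed events (situation information)
    def merge(self, perception, situation):
        self.objects.update(perception)
        self.events.extend(situation)

def collaborative_decision(proposed_action, human_approves, autonomy_approves):
    # Both parties must concur: either the human driver or the autonomy logic may veto.
    return proposed_action if (human_approves and autonomy_approves) else None

if __name__ == "__main__":
    truck_a, truck_b = SharedWorldModel(), SharedWorldModel()
    derived = ({"veh-17": {"range_m": 42.0, "lane": 2}}, ["slow traffic ahead"])
    truck_a.merge(*derived)    # first vehicle derives and stores the information locally
    truck_b.merge(*derived)    # second vehicle receives the same update over direct V2V
    print(collaborative_decision("change_lane_left", human_approves=True, autonomy_approves=True))
    print(collaborative_decision("change_lane_left", human_approves=False, autonomy_approves=True))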
Please summarize the input | Systems and methods for simulating GNSS multipath and obscuration with networked autonomous vehiclesThe disclosed technology teaches testing an autonomous vehicle: shielding a GNSS receiving antenna of the vehicle from ambient GNSS signals while the vehicle is under test and supplanting the ambient GNSS signals with simulated GNSS signals. Testing includes using a GNSS signal generating system: receiving the ambient GNSS signals using an antenna of the system and determining a location and acceleration of the vehicle from the GNSS signals, accessing a model of an augmented environment that includes multi-pathing and obscuration of the GNSS signals along a test path, based on the determined location—generating the simulated GNSS signals to feed to the vehicle, in real time—simulating at least one constellation of GNSS satellite sources modified according to the augmented environment, based on the determined location, and feeding the simulated signals to a receiver in the vehicle, thereby supplanting ambient GNSS as the autonomous vehicle travels along the test path.We claim as follows:
| 1. A method of testing an autonomous vehicle, including
shielding a Global Navigation Satellite System (abbreviated GNSS) receiving antenna of the autonomous vehicle from ambient GNSS signals while the autonomous vehicle is under test and supplanting the ambient GNSS signals with simulated GNSS signals;
using a GNSS signal generating system,
receiving the ambient GNSS signals using an antenna of the GNSS signal generating system and determining a location and acceleration of the autonomous vehicle from the ambient GNSS signals;
accessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient GNSS signals along a test path, based on the location determined from the GNSS signals;
generating the simulated GNSS signals to feed to the autonomous vehicle, in real time, simulating at least one constellation of GNSS satellite sources modified according to the augmented environment, based on the location determined from the GNSS signals; and
feeding the simulated GNSS signals to a receiver in the autonomous vehicle, thereby supplanting ambient GNSS as the autonomous vehicle travels along the test path.
| 2. The method of claim 1, further including spoofing by substituting pirate signals for ambient GNSS as the autonomous vehicle travels along the test path.
| 3. The method of claim 1, further including wireless and conductive feeds of the simulated GNSS signals.
| 4. The method of claim 1, further including using a Faraday cage to shield the antenna of the autonomous vehicle.
| 5. The method of claim 1, further including coupling the received ambient GNSS signals with inertial measurements unit (abbreviated IMU) input to determine the position of the vehicle in real time with reduced latency.
| 6. The method of claim 1, further including operating the vehicle on a track and simulating buildings.
| 7. The method of claim 1, further including operating the vehicle in an urban environment and combining impaired GNSS signals with object sensors (visual, LIDAR, SONAR, RADAR) used by the vehicle for navigation.
| 8. The method of claim 1, further including operating the vehicle in an urban environment and combining impaired GNSS signals with vehicle to vehicle (abbreviated V2V) and vehicle to infrastructure (Abbreviated V2I) communications used by the vehicle for navigation.
| 9. A method of testing a connected vehicle that is connected to other vehicles and/or infrastructure, including:
shielding a cellular receiving antenna of the connected vehicle from ambient cellular signals while the connected vehicle is under test and supplanting the ambient cellular signals with simulated cellular signals;
using a cellular signal generating system,
receiving the ambient cellular signals and ambient Global Navigation Satellite System (abbreviated GNSS) signals using at least one antenna of the cellular signal generating system and determining a location and acceleration of the connected vehicle from the ambient GNSS signals;
accessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient cellular signals along a test path, based on the location determined from the cellular signals;
generating the simulated cellular signals to feed to the connected vehicle, in real time, simulating at least one vehicle and/or infrastructure source modified according to the augmented environment, based on the location determined from the cellular signals; and
feeding the simulated cellular signals to a receiver in the connected vehicle, thereby supplanting ambient cellular as the connected vehicle travels along the test path.
| 10. The method of claim 9, wherein the ambient signals include at least one of GNSS, Wi-Fi, 5G and LTE signals that can be manipulated and impaired to test situational awareness of the vehicle in fully controlled and challenging RF environments.
| 11. A tangible non-transitory computer readable storage media impressed with computer program instructions that, when executed, test an autonomous vehicle, including
shielding a Global Navigation Satellite System (abbreviated GNSS) receiving antenna of the autonomous vehicle from ambient GNSS signals while the autonomous vehicle is under test and supplanting the ambient GNSS signals with simulated GNSS signals;
using a GNSS signal generating system,
receiving the ambient GNSS signals using an antenna of the GNSS signal generating system and determining a location and acceleration of the autonomous vehicle from the ambient GNSS signals;
accessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient GNSS signals along a test path, based on the location determined from the GNSS signals;
generating the simulated GNSS signals to feed to the autonomous vehicle, in real time, simulating at least one constellation of GNSS satellite sources modified according to the augmented environment, based on the location determined from the GNSS signals; and
feeding the simulated GNSS signals to a receiver in the autonomous vehicle, thereby supplanting ambient GNSS as the autonomous vehicle travels along the test path.
| 12. The tangible non-transitory computer readable storage media of claim 11, further including spoofing by substituting pirate signals for ambient GNSS as the autonomous vehicle travels along the test path.
| 13. The tangible non-transitory computer readable storage media of claim 11, further including wireless and conductive feeds of the simulated GNSS signals.
| 14. The tangible non-transitory computer readable storage media of claim 11, further including using a Faraday cage to shield the antenna of the autonomous vehicle.
| 15. The tangible non-transitory computer readable storage media of claim 11, further including coupling the received ambient GNSS signals with inertial measurement unit (abbreviated IMU) input to determine the position of the vehicle in real time with reduced latency.
| 16. The tangible non-transitory computer readable storage media of claim 11, further including operating the vehicle on a track and simulating buildings.
| 17. The tangible non-transitory computer readable storage media of claim 11, further including operating the vehicle in an urban environment and combining impaired GNSS signals with object sensors (visual, LIDAR, SONAR, RADAR) used by the vehicle for navigation.
| 18. A system for testing autonomous vehicles includes one or more processors coupled to memory, the memory loaded with computer instructions, that when executed on the processors, implement the shielding, receiving, accessing, generating and feeding of claim 11.
| 19. A tangible non-transitory computer readable storage media impressed with computer program instructions that, when executed, test a connected vehicle that is connected to other vehicles and/or infrastructure, including
shielding a cellular receiving antenna of the connected vehicle from ambient cellular signals while the connected vehicle is under test and supplanting the ambient cellular signals with simulated cellular signals;
using a cellular signal generating system,
receiving the ambient cellular signals and ambient Global Navigation Satellite System (abbreviated GNSS) signals using at least one antenna of the cellular signal generating system and determining a location and acceleration of the connected vehicle from the ambient GNSS signals;
accessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient cellular signals along a test path, based on the location determined from the cellular signals;
generating the simulated cellular signals to feed to the connected vehicle, in real time, simulating at least one vehicle and/or infrastructure source modified according to the augmented environment, based on the location determined from the cellular signals; and
feeding the simulated cellular signals to a receiver in the connected vehicle, thereby supplanting ambient cellular as the connected vehicle travels along the test path.
| 20. A system for testing a connected vehicle that is connected to other vehicles and/or infrastructure, includes one or more processors coupled to memory, the memory loaded with computer instructions, that when executed on the processors, implement the shielding, receiving, accessing, generating and feeding of claim 19. | The method involves shielding a GNSS receiving antenna of an autonomous vehicle from ambient GNSS signals while the autonomous vehicle is under test and the ambient GNSS signals are supplanted with simulated GNSS signals. A GNSS signal generating system is used. The ambient GNSS signals are received using an antenna of the GNSS signal generating system and a location and acceleration of the autonomous vehicle are determined from the ambient GNSS signals. The simulated GNSS signals are generated to feed to the autonomous vehicle, in real time, and at least one constellation of GNSS satellite sources, modified according to the augmented environment, is simulated based on the location determined from the GNSS signals. The simulated GNSS signals are fed to a receiver in the autonomous vehicle, thus supplanting ambient GNSS as the autonomous vehicle travels along the test path. INDEPENDENT CLAIMS are included for the following: (1) a method for testing a connected vehicle that is connected to other vehicles and infrastructure; (2) tangible non-transitory computer readable storage media storing a program for testing an autonomous vehicle; and (3) a system for testing an autonomous vehicle. Method for testing an autonomous vehicle. The cellular and GNSS testing is enhanced using an inertial measurement unit to improve the accuracy of location determination from GNSS signals, especially under jerk conditions. The GNSS correction data is used and additional sensors are integrated into the onboard navigation system, to increase accuracy, availability and integrity. The track and the required environment are first modelled within the three dimensional (3D) environment model simulation software and then used in real time, to calculate the obscuration, multipath and other impairments from the scene. The drawing shows a flow chart of the method for testing autonomous vehicle. 400Method for testing autonomous vehicle 428Vehicle antenna 445GNSS simulator 455Three dimensional environment 465Three dimensional module
Please summarize the input | Information processing apparatus, information processing method, and mobile body apparatusProvided is an information processing apparatus that creates map information on the basis of sensor information obtained by an on-vehicle sensor. The information processing apparatus includes a creation section that creates a map of a surrounding area of a mobile body on the basis of sensor information acquired by one or more sensors mounted on the mobile body, a request section that issues an information request to an external apparatus on the basis of a state of the map created by the creation section, and a merge section that merges information acquired by the request section from the external apparatus with the created map. The request section issues an information request to the external apparatus on the basis of a condition of a dead angle included in the map created by the creation section.The invention claimed is:
| 1. An information processing apparatus, comprising:
a creation section configured to create a map of a surrounding area of a first mobile body based on sensor information acquired by at least one sensor mounted on the first mobile body, wherein the map includes a first grid map indicating object existence probabilities in respective grids;
a request section configured to:
issue a first information request to an external apparatus based on a state of the map; and
acquire information from the external apparatus, wherein the acquired information is a second grid map;
a merge section configured to merge the first grid map with the second grid map; and
a control section configured to control driving of the first mobile body based on one of a merging result or the map created by the creation section, wherein the merging result is based on the merger of the first grid map with the second grid map.
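The claims fuse the first grid map (own sensing) with a second grid map received from the external apparatus, both holding per-cell object existence probabilities. A common way to perform such a fusion is independent-evidence log-odds addition; the sketch below uses that rule as an assumption, since the claims do not prescribe a particular merge formula.

```python
import numpy as np

def merge_probability_grids(own_grid, remote_grid, eps=1e-6):
    """Fuse two aligned occupancy grids of existence probabilities in [0, 1].

    Uses independent-evidence log-odds addition (a standard choice, assumed
    here). Both grids must already be registered to the same pose and
    resolution before merging.
    """
    p1 = np.clip(np.asarray(own_grid, dtype=float), eps, 1.0 - eps)
    p2 = np.clip(np.asarray(remote_grid, dtype=float), eps, 1.0 - eps)
    log_odds = np.log(p1 / (1.0 - p1)) + np.log(p2 / (1.0 - p2))
    return 1.0 / (1.0 + np.exp(-log_odds))

# Example: a cell the own sensors could not see (0.5 = unknown) takes the
# remote vehicle's evidence, while agreeing cells reinforce each other.
own = np.array([[0.5, 0.9], [0.1, 0.5]])
remote = np.array([[0.8, 0.9], [0.1, 0.5]])
merged = merge_probability_grids(own, remote)
```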
| 2. The information processing apparatus according to claim 1, wherein the issuance of the first information request to the external apparatus is based on a condition of a dead angle included in the map.
| 3. The information processing apparatus according to claim 1, wherein the issuance of the first information request to the external apparatus is based on detection of a failure in the at least one sensor.
| 4. The information processing apparatus according to claim 1, wherein, in a case where autonomous driving of the first mobile body based on the map is discontinued, the request section is further configured to issue a second information request to the external apparatus.
| 5. The information processing apparatus according to claim 4, wherein, in a case where evacuation of the first mobile body to a safe place is impossible due to a dead angle included in the map created by the creation section, the request section is further configured to issue a third information request to the external apparatus.
| 6. The information processing apparatus according to claim 1, wherein the request section is further configured to issue a second information request to the external apparatus based on a result of comparison of information regarding a current position of the first mobile body with map information.
| 7. The information processing apparatus according to claim 6, wherein, in a case where the map information indicates that a plurality of dead angles from the current position of the first mobile body exists, the request section is further configured to issue a third information request to the external apparatus.
| 8. The information processing apparatus according to claim 1, wherein the request section is further configured to issue a request to the external apparatus for one of map information to complement a dead angle included in the map or sensor information that is used to create a specific map to complement the dead angle.
| 9. The information processing apparatus according to claim 1, wherein the request section is further configured to control issuance of a second information request to the external apparatus, based on the merging result.
| 10. The information processing apparatus according to claim 9, wherein
the information acquired from the external apparatus is merged with the map created at the merge section, and
the request section is further configured to continue issuance of a request to the external apparatus until dead angles included in the map become equal to or less than a specific value, or stop the issuance of the request to the external apparatus when the dead angles included in the map become equal to or less than the specific value.
| 11. The information processing apparatus according to claim 1, wherein the request section is further configured to issue a second information request to a second mobile body.
| 12. The information processing apparatus according to claim 1, wherein
the first mobile body includes a first vehicle, and
the request section is further configured to issue a second information request to a second vehicle through vehicle-to-vehicle communication.
| 13. The information processing apparatus according to claim 1, wherein
each of the creation section, the request section, and the merge section is further configured to perform information processing on the map for each grid.
| 14. An information processing method, comprising:
creating a map of a surrounding area of a mobile body based on sensor information acquired by at least one sensor mounted on the mobile body, wherein the map includes a first grid map indicating object existence probabilities in respective grids;
issuing an information request to an external apparatus based on a state of the map;
acquiring information from the external apparatus, wherein the acquired information is a second grid map;
merging the first grid map with the second grid map; and
controlling driving of the mobile body based on one of a merging result or the created map, wherein the merging result is based on the merger of the first grid map with the second grid map.
| 15. An information processing apparatus, comprising:
a creation section configured to create a map of a surrounding area of a first mobile body based on sensor information acquired by at least one sensor mounted on the first mobile body, wherein the map includes a grid map indicating object existence probabilities in respective grids; and
a providing section configured to provide at least partial information of the map created by the creation section, in response to a request from an external apparatus, wherein the external apparatus controls a second mobile body based on the at least partial information of the map created by the creation section.
| 16. The information processing apparatus according to claim 15, wherein the providing section is further configured to:
receive the request together with position information of the external apparatus, and
provide information of the map to the external apparatus that exists within a specific range from current position information of the first mobile body.
| 17. The information processing apparatus according to claim 15, wherein
the first mobile body includes a first vehicle, and
the providing section is further configured to provide information of the map to a second vehicle through vehicle-to-vehicle communication.
| 18. An information processing method, comprising:
creating a map of a surrounding area of a first mobile body based on sensor information acquired by at least one sensor mounted on the first mobile body, wherein the map includes a grid map indicating object existence probabilities in respective grids; and
providing, by a request section, at least partial information of the created map, in response to a request from an external apparatus, wherein the external apparatus controls a second mobile body based on the at least partial information of the map created.
| 19. A mobile body apparatus, comprising:
a mobile body comprising a mobile body main part;
at least one sensor mounted on the mobile body main part;
a creation section configured to create a map of a surrounding area of the mobile body based on sensor information acquired by the at least one sensor, wherein the map includes a first grid map indicating object existence probabilities in respective grids;
a request section configured to:
issue information request to an external apparatus based on a state of the map; and
acquire information from the external apparatus, wherein the acquired information is a second grid map;
a merge section configured to merge the first grid map with the second grid map; and
a control section configured to control driving of the mobile body main part based on one of a merging result or the map created by the creation section, wherein the merging result is based on the merger of the first grid map with the second grid map. | The information processing apparatus has a preparation unit which produces the map around the mobile object such as vehicle (200) based on the sensor information acquired by one or more sensors mounted in the mobile object. The request unit requests information of an external device based on the state of the map produced by preparation unit. A synthetic unit synthesizes the information obtained from the external device by the request unit with the produced map. The request unit requests the information of the external device based on the condition of the blind spot contained in the map created by the preparation unit. INDEPENDENT CLAIMS are included for the following:the information processing method; andthe mobile object apparatus. Information processing apparatus used during processing of sensor information from vehicle-mounted sensors mounted in vehicle, of vehicle control system. Can also be used in processing of sensor information from sensors mounted in robot, ship, aircraft, unmanned aircraft such as drone, and predetermined working spaces such as home, office and factory. The synthetic unit synthesizes the information obtained from the external device by the request unit with the produced map, so that the provision of the information processing apparatus which complements the blind spot contained in the map information based on own sensor information based on the information from an external device can be possible. The drawing shows an explanatory view illustrating the synthesize of the grid map of the own vehicle and grid map of the surrounding vehicle. (Drawing includes non-English language text) 200Vehicle201-204,221,222Vehicle-mounted cameras210Bicycle220Surrounding vehicle700Grid map |
Please summarize the input | DEVELOPMENT OF WIRELESS RESOURCE AND COMPUTATION OFFLOADING FOR ENHANCED ENERGY EFFICIENCY IN THE INTERNET OF VEHICLES (IOV)The advent of the Internet of Vehicles (IoV) has brought about a paradigm shift in the field of computing, leading to enhanced vehicle intelligence and improved computational services for applications that require high processing power and minimal latency. These applications include autonomous driving, vehicular virtual reality, and real-time traffic control. The Internet of Vehicles (IoV) holds significant potential for extensive development due to the rapid advancements in vehicle wireless connection technologies. Security applications are of paramount importance within the realm of the Internet of Vehicles (IoV) due to their direct impact on vehicle safety. The concept of Vehicle-to-Vehicle (V2V) communication has garnered significant academic interest within the field of intelligent transportation systems (ITS). This technology is recognised for its potential to fulfil the stringent latency and reliability criteria necessary for safety applications. The Internet of Vehicles (IoV) is a nascent concept that is anticipated to play a crucial role in future mobile networks beyond the fifth and sixth generations. Nevertheless, the computational demands and stringent time limitations of Internet of Vehicles (IoV) applications provide a formidable obstacle for vehicle processing units. In order to achieve this objective, multi-access edge computing (MEC) has the potential to utilise the computing resources located at the periphery of the network in order to fulfil the high computational requirements. However, the allocation of computer resources in an optimal manner is a significant challenge due to the presence of multiple parameters, including the quantity of cars, the availability of resources, and the specific demands associated with each individual activity. This study examines a network comprising several vehicles linked to roadside units (RSUs) equipped with mobile edge computing (MEC) capabilities. We present a methodology that aims to minimise the overall energy consumption of the system by concurrently optimising the decision-making process for task offloading, power and bandwidth allocation, and task assignment to MEC-enabled RSUs. In order to address the inherent complexity of the original problem, we employ a strategy of decoupling it into smaller subproblems. To iteratively optimise these subproblems, we utilise the block coordinate descent approach. The numerical findings provide evidence that the suggested system is capable of significantly reducing overall energy usage across different quantities of cars and MEC nodes, all the while ensuring a minimal likelihood of service disruption.|1. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) provides ground work for future research.
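The abstract above decouples the energy-minimisation problem into three blocks (offloading decisions, power/bandwidth allocation, task-to-RSU assignment) and iterates with block coordinate descent. The skeleton below shows only that alternating loop; the subproblem solvers and the energy model are supplied by the caller, since their formulations are not reproduced here.

```python
from typing import Any, Callable, Tuple

def block_coordinate_descent(
    x0: Tuple[Any, Any, Any],
    solve_offloading: Callable[[Any, Any], Any],
    solve_power_bandwidth: Callable[[Any, Any], Any],
    solve_assignment: Callable[[Any, Any], Any],
    total_energy: Callable[[Any, Any, Any], float],
    max_iters: int = 50,
    tol: float = 1e-3,
):
    """Alternate over the three variable blocks named in the abstract, keeping
    two blocks fixed while the third is re-optimised. Only the loop is sketched;
    the caller provides the subproblem solvers and the system energy model."""
    offload, power_bw, assign = x0
    prev_energy = float("inf")
    energy = prev_energy
    for _ in range(max_iters):
        offload = solve_offloading(power_bw, assign)           # block 1: local vs. offload decisions
        power_bw = solve_power_bandwidth(offload, assign)      # block 2: transmit power and bandwidth shares
        assign = solve_assignment(offload, power_bw)           # block 3: task-to-RSU (MEC) assignment
        energy = total_energy(offload, power_bw, assign)
        if prev_energy - energy < tol:   # objective is non-increasing, so this detects convergence
            break
        prev_energy = energy
    return offload, power_bw, assign, energy
```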
| 2. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein it is stated that the Internet of Things (IoT) refers to a network of physical items that possess the ability to interact, communicate, and exchange data with each other and the surrounding environment through a network, without requiring human interaction.
| 3. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein it is stated that, with the rapid expansion of the Internet of Things (IoT) across all domains, the computational requirements posed by growing automotive applications have presented a significant problem within the context of the Internet of Vehicles (IoV).
| 4. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein it is anticipated that forthcoming wireless networks will possess the capacity to deliver data and voice services to a substantial quantity of mobile devices (MDs), while also enabling the integration of computational and artificial intelligence (AI) functionalities within these MDs.
| 5. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein it is stated that various aspects are analysed and discussed in this work.
| 6. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein it is stated that, additionally, the emergence of the Smart Internet of Vehicles (IoV) as a promising application within the realm of the Internet of Things (IoT) can be attributed to the advancements in fifth generation mobile connectivity. | The method involves providing groundwork for future research, where the Internet of Things (IoT) refers to a network of physical items that possess an ability to interact, communicate, and exchange data with each other and a surrounding environment through a network without requiring human interaction. Data and voice services are delivered to a quantity of mobile devices, while enabling integration of computational and artificial intelligence (AI) functionalities within the mobile devices. Wireless resource and computation offloading method for the Internet of Vehicles (IoV). The method enables minimizing the overall energy consumption of the IoV by concurrently optimizing the decision-making process for task offloading, power and bandwidth allocation, and task assignment to mobile edge computing (MEC)-enabled roadside units (RSUs). The method enables efficient utilization of computing resources for facilitating resource sharing.
Please summarize the input | A PROCESS OF MONITORING AND EVALUATING HUMAN PHYSICAL HEALTH PARAMETERS AND METHOD OF USE BY DASH-CAMThe present invention relates to monitoring, evaluating and reporting physical health parameters of humans by a modular dash-cam. Further, it evaluates the driving skills of a driver using a dash-cam with unique hardware and software capabilities. The dash-cam comprises a modular rotatable thermal camera with improved Field of View (FOV) that monitors and evaluates a person's physical health parameters such as temperature, cough, mask detection, sanitization, oxygen level measurement etc., and an optical camera that provides security surveillance inside the vehicle and, upon flipping, starts analyzing the exterior and route. The driver and autonomous vehicle driving skill results are communicated to multiple surrounding vehicles and pedestrians using similar V2V communication, to alert about potential near risky encounters. The design is scalable to multiple applications like oil/gas leakage, parking assistance, fire detection in vehicle, video game production in low light, AI based movie review, fire detection, drill automation, security monitoring, smart agriculture and integration to PPE smart jacket.|1. A system of determining the health parameters of the driver and/or co-passengers entering into a vehicle utilizing edge and/or cloud computing comprises: a modular dash cam attached to the interior of the windshield, comprising: a rotatable thermal camera designed to capture the temperature and other health parameters inside the vehicle; and a flipable or pivoted optical camera to record the activities inside and outside of the vehicle; a processing unit connected to the memory; a plurality of sensors to detect the health parameters; a rechargeable battery assembly; wherein the collected health data is stored in an unstructured, distributed way in an artificial intelligence engine and analysed by a processing unit to provide feedback in real-time to the driver.
| 2. A modular dash camera of claim 1, sends alerts to the in-vehicle display, driver and/or any emergency contact.
| 3. A modular dash camera of claim 1, wherein the health parameters are such as but not limited to temperature and other body vitals.
| 4. A modular dash camera of claim 1, stores the identity of the driver and co-passenger with a unique identifier to protect privacy.
| 5. A modular dash camera of claim 1, where the thermal and optical camera arrangement is modular in nature and can be interchangeable.
| 6. A modular dash camera of claim 1, wherein the sensors are for detecting oxygen levels, temperature, humidity, blood pressure, heart rate and other sensors.
| 7. A modular dash camera of claim 1, is part of an artificial intelligence engine, where the artificial intelligence engine is trained for distributed computing.
| 8. A method of determining the health parameters of the drivers and/or co-passengers entering into the vehicle utilizing edge and cloud computing comprises the steps of: obtaining a complete picture of the region of interest, with an overlapping field of vision with the thermal camera; initiating a voice interaction with a microphone and speakers embedded within; concealing the identity of the driver and co-passenger by providing a unique identifier; automatically flipping the optical camera for a surround view of the interior or exterior of a place or a vehicle; determining the oxygen levels in the area of concern with an oximeter installed with the camera; and reporting via live stream and stream on identification of the region and event of interest.
| 9. The method of claim 8, wherein health parameters may be temperature, cough and sneezing, blood pressure, Heart rate and other parameters detectable by dash cam.
| 10. The method of claim 8, sending alerts to the in-vehicle display, driver and/or any emergency contact.
| 11. The method of claim 8, wherein the health parameters are such as but not limited to temperature and other body vitals.
| 12. The method of claim 8, storing the identity of the driver and co-passenger with a unique identifier to protect privacy.
| 13. The method of claim 8, wherein the sensors are for detecting oxygen levels, temperature, humidity, blood pressure, heart rate and other sensors.
| 14. A method of determining the driving skills of the driver using a modular dash cam comprising the steps of: activating the optical camera with an inbuilt AI processing unit with machine learning software, capable of identifying the road region in front and understanding scene complexity, traffic signs, speed of the car and various other parameters of analysis; comparing with previously trained artificial intelligence algorithms by collecting data in a similar situation during driving with a skilled driver/instructor with precision driving skills; simultaneously processing thermal images in the software application from the thermal camera to understand various driver health analytics, such as whether the driver is drowsy, measuring the anxiety levels of the driver during critical situations, understanding precision in right or left turns or roundabouts, or understanding whether speed limits were maintained; creating the skill test report of the driver, based on the parameters of analysis and comparison; sending the report to the driver and others to improve or rate the driving skills of the driver.
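Claim 14 compares the driver's telemetry against a reference recorded with a skilled instructor and turns the comparison into a skill report. The sketch below illustrates one simple way to do that with per-parameter RMS deviations; the parameter names and the scoring formula are illustrative assumptions, not taken from the patent.

```python
import math

def driving_skill_report(driver_log, instructor_log, weights=None):
    """Toy scoring of claim 14's comparison step.

    driver_log / instructor_log: dicts mapping parameter name -> list of samples
    taken in a similar situation. The RMS-deviation-to-score mapping is an
    assumption made for illustration only.
    """
    params = ["speed", "steering_angle", "lane_offset", "braking"]  # hypothetical parameter names
    weights = weights or {p: 1.0 for p in params}
    report = {}
    for p in params:
        pairs = list(zip(driver_log[p], instructor_log[p]))
        rms = math.sqrt(sum((d - r) ** 2 for d, r in pairs) / len(pairs))
        # Map deviation to a score: zero deviation -> 100, larger deviation -> lower.
        report[p] = round(100.0 / (1.0 + weights[p] * rms), 1)
    report["overall"] = round(sum(report[p] for p in params) / len(params), 1)
    return report
```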
| 15. The method of claim 14, is used by insurance companies to allow or deny the insurance in case of any accident due to driver's mistake.
| 16. The method of claim 14, wherein the driver can designate the controls or take controls in an autonomous vehicle if the skills of driver are not perfect or perfect respectively.
| 17. The method of claim 14, wherein the driver and autonomous vehicle driving skill, together with the current circumstances, is communicated to multiple surrounding vehicles through V2V (vehicle to vehicle) communication technology, which can potentially alert multiple other vehicles about potential near risky encounters. | The system has a modular dash cam (810) that is attached to an interior of a windshield. A rotatable thermal camera captures temperature and health parameters inside a vehicle. A flipable or pivoted optical camera records activities inside and outside of the vehicle. Sensors detect the health parameters. A processing unit is connected to a memory. Collected health data is stored in an unstructured distributed way in an artificial intelligence engine and analyzed by the processing unit to provide feedback in real-time to a driver. The sensors detect oxygen levels, temperature, humidity, blood pressure, heart rate and other parameters. INDEPENDENT CLAIMS are included for the following: a method of determining the health parameters of the drivers and/or co-passengers entering into the vehicle; and a method of determining the driving skills of the driver using a modular dash cam. System for determining health parameters of driver and/or co-passengers entering a vehicle, particularly an aeroplane, utilizing edge and cloud computing. Dynamic pairing between electronic devices, based on the time and proximity of the devices, reduces the possibility of unintentional communications. The system for determining the health parameters of the driver and/or co-passengers entering into a vehicle utilizing edge and cloud computing comprises a modular dash cam attached to the interior of the windshield, which comprises a rotatable thermal camera designed to capture the temperature and other health parameters inside the vehicle, and a flipable or pivoted optical camera to record the activities inside and outside of the vehicle. The modular dash camera has a thermal and optical camera arrangement that is modular in nature and interchangeable. The drawing shows a schematic view of the dash-cams set up in an aeroplane. 810Modular dash cam812Seat
Please summarize the input | Domain controller and automatic driving vehicleThe utility model discloses a domain controller and an automatic driving vehicle, wherein the domain controller comprises: a slave SOC for performing signal processing on the image detection signal output by the multi-path high-definition camera and the radar data signal output by the multi-path vehicular Ethernet, and outputting the corresponding environment processing signal; and a main SOC, the main SOC being connected with the secondary SOC and used for performing signal processing according to the environment processing signal and outputting the corresponding driving planning signal. The technical solution of the utility model improves the computing power and computational precision of the domain controller of the automatic driving automobile so as to improve the driving safety of the automatic driving automobile.|1. A domain controller, which is applied to an automatic driving automobile, the automatic driving automobile comprising a plurality of high-definition cameras, a plurality of laser radars and a vehicle-mounted Ethernet, wherein the domain controller comprises: a slave SOC, the slave SOC being used for respectively accessing a multi-path high-definition camera and a multi-path vehicle-mounted Ethernet, for performing signal processing on the image detection signal output by the multi-path high-definition camera and the radar data signal output by the multi-path vehicle-mounted Ethernet, and outputting the corresponding environment processing signal, wherein the number of the secondary SOCs is at least two; a main SOC, the main SOC being connected with the output end of the secondary SOC, the main SOC being used for performing signal processing according to the environment processing signal and outputting the corresponding driving planning signal, so as to control the function module of the automatic driving automobile to work.
| 2. The domain controller according to claim 1, wherein the number of the secondary SOCs is two, namely a first secondary SOC and a second secondary SOC; the first secondary SOC and the second secondary SOC are respectively electrically connected with the main SOC, the multi-path high-definition camera and the multi-path vehicle Ethernet; the first secondary SOC is used for performing signal processing on the received multi-path image detection signal and multi-path radar data signal, and outputting a corresponding first environment processing signal; the second secondary SOC is used for performing signal processing on the received multi-path image detection signals and multi-path radar data signals, and outputting a corresponding second environment processing signal; the main SOC is used for performing signal processing on the received first environment processing signal and/or second environment processing signal, and outputting the corresponding driving planning signal.
| 3. The domain controller according to claim 2, wherein the number of the secondary SOCs is four, namely the first secondary SOC, the second secondary SOC, a third secondary SOC and a fourth secondary SOC; the first secondary SOC, the second secondary SOC, the third secondary SOC and the fourth secondary SOC are respectively electrically connected with the main SOC; the first secondary SOC is used for outputting the received multi-path image detection signal and multi-path radar data signal to the third secondary SOC and/or the fourth secondary SOC through the main SOC; the second secondary SOC is used for outputting the received multi-path image detection signal and multi-path radar data signal to the third secondary SOC and/or the fourth secondary SOC through the main SOC; the third secondary SOC and the fourth secondary SOC are respectively used for processing the received multi-path image detection signal and multi-path radar data signal, and outputting the corresponding driving planning signal through the main SOC.
| 4. The domain controller according to claim 1, wherein the automatic driving vehicle further comprises a driving component, the domain controller further comprises: a function safety MCU, the function safety MCU is electrically connected with the main SOC, the function safety MCU is used for performing signal processing on the received driving planning signals, and outputting the corresponding driving control signal to the driving component, so as to control the driving route and driving speed of the driving component.
| 5. The domain controller according to claim 4, wherein the domain controller further comprises: a CANFD interface, the CANFD interface is electrically connected with the functional safety MCU, for accessing one or more of millimetre wave radar, ultrasonic radar or vehicle control ECU.
| 6. The domain controller according to claim 4, wherein the domain controller further comprises: a FlexRay interface electrically connected with the functional safety MCU and used for accessing one or more of laser radar, V2X communication module or EIMU detection system.
| 7. The domain controller according to claim 1, wherein the domain controller further comprises: a FAKRA interface, the FAKRA interface being used for electrically connecting with the multi-path high-definition camera and accessing the image detection signal output by the multi-path high-definition camera; a de-serializing chip electrically connected with the FAKRA interface and the secondary SOC, respectively, for decoding the received image detection signal and outputting the image detection signal to the secondary SOC for signal processing so as to output the corresponding environment processing signal; the main SOC is used for processing the received environment processing signal and outputting the corresponding driving planning signal.
| 8. The domain controller according to claim 1, wherein the domain controller further comprises: a plurality of storage modules, a plurality of storage modules are respectively electrically connected with the main SOC and the secondary SOC, a plurality of storage modules are respectively used for storing the corresponding temporary data.
| 9. The domain controller according to claim 1, wherein the domain controller further comprises: a plurality of power supply management modules, a plurality of power supply management modules are respectively electrically connected with the main SOC and the secondary SOC, a plurality of power supply management modules are respectively used for accessing the direct current power supply, and respectively controlling the direct current power supply to access/stop accessing the main SOC and/or the secondary SOC.
| 10. An automatic driving vehicle, comprising multiple high-definition cameras, a vehicle Ethernet and the domain controller according to any one of claims 1 to 9. | The controller has a slave system on chip (SOC) (10) which performs data processing on the image detection signal output by a multi-channel high-definition camera and the radar data signal output by the multi-channel vehicle Ethernet, and outputs the corresponding environment processing signal. A master SOC (30) is connected to the slave SOC. The master SOC performs signal processing according to the environment processing signal, and outputs the corresponding driving planning signal, to control the operation of the functional modules of the autonomous driving vehicle. Domain controller for automatic driving automobile (claimed). The computing power and the calculation accuracy of the domain controller of the automatic driving automobile are improved, thus improving the driving safety of the automatic driving automobile. The drawing shows a block diagram of the domain controller. (Drawing includes non-English language text)10Slave SOC 11First Slave SOC 12Second Slave 13Third Slave SOC 20Master SOC 30Functional safety MCU 50Deserialization chip 60Storage Module 70Power management module
Please summarize the input | TRAFFIC CONTROL USING SOUND SIGNALSMethods for vehicle to vehicle communication, vehicle detection, and vehicle to traffic sign communication are devised. Such methods can involve the use of one or a plurality of speakers to emit artificial sound signals, as well as the use of one or a plurality of sound detectors to record artificial or natural sound signals emitted by nearby vehicles or traffic signs. The use of an active sonar system will also allow autonomous vehicles to detect nearby surroundings. The Doppler Effect can also be used to determine the speeds of moving vehicles. These methods allow autonomous vehicles to drive and respond to their surroundings, and also allow traffic signs to respond to various traffic situations by detecting the presence of nearby vehicles.|1. A method for automobiles for detecting nearby traffic conditions that comprises the following steps:
record sound signals measured by one or a plurality of sound detectors in the automobile,
use the signal processing capabilities of the automobile to analyze the recorded sound signals to identify sound signals emitted by nearby traffic signs or vehicles,
use the sound signals emitted by nearby traffic signs or vehicles to assess surrounding traffic conditions, and
provide traffic information to direct the driving of the automobile.
| 2. The method in claim 1 wherein the step of recording sound signals comprises the step of recording sound signals measured by two or more microphones in the automobile.
| 3. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of comparing the recorded sound signals to a database of already known vehicle noise patterns to determine the types of nearby vehicles.
| 4. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing sound signals coming from different vehicles in order to estimate the number of nearby vehicles.
| 5. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of using the Doppler Effect to determine the relative speeds of nearby vehicles.
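Claim 5 determines relative speeds from the Doppler effect. For a moving sound source and a stationary receiver the textbook relation is f_obs = f_src * c / (c - v), which can be inverted as in the sketch below; the patent does not fix a particular formula, so this is only an illustration.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def relative_speed_from_doppler(f_emitted, f_observed, c=SPEED_OF_SOUND):
    """Closing speed of a sound source toward a stationary receiver.

    From f_obs = f_src * c / (c - v) it follows that v = c * (1 - f_src / f_obs).
    Positive result: source approaching; negative: receding.
    """
    return c * (1.0 - f_emitted / f_observed)

# Example: a 1000 Hz engine tone heard at 1030 Hz implies roughly a 10 m/s closing speed.
print(round(relative_speed_from_doppler(1000.0, 1030.0), 1))
```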
| 6. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing sound signals that are in a pre-defined format coming from nearby vehicles.
| 7. The method in claim 6 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing amplitude modulated sound signals coming from nearby vehicles.
| 8. The method in claim 6 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing frequency modulated sound signals coming from nearby vehicles.
| 9. The method in claim 1 further comprises a step that uses active sonar to transmit a sound signal and detect the echo of the transmitted sound in order to detect the surroundings of the automobile.
| 10. The method in claim 1 further comprises a step of transmitting a sound signal that is in a pre-defined format for communicating with nearby vehicles or traffic signs.
| 11. The method in claim 10 comprises a step of transmitting an amplitude modulated sound signal that is in a pre-defined format in order to communicate with nearby vehicles or traffic signs.
| 12. The method in claim 10 comprises a step of transmitting a frequency modulated sound signal that is in a pre-defined format for communicating with nearby vehicles or traffic signs.
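Claims 10 through 12 transmit sound signals in a pre-defined amplitude- or frequency-modulated format. The sketch below shows a toy audio-band FSK encoder as one possible frequency-modulated format; the carrier frequencies, baud rate and framing are arbitrary choices for the example and are not specified by the patent.

```python
import numpy as np

def fsk_encode(bits, sample_rate=44100, baud=100, f0=3000.0, f1=3500.0):
    """Encode a bit string as an audio-band FSK burst (one tone per bit).

    f0 carries a '0' bit, f1 carries a '1' bit. All numeric choices here are
    illustrative assumptions, not values taken from the patent.
    """
    samples_per_bit = int(sample_rate / baud)
    t = np.arange(samples_per_bit) / sample_rate
    chunks = [np.sin(2 * np.pi * (f1 if b == "1" else f0) * t) for b in bits]
    return np.concatenate(chunks).astype(np.float32)

# A short vehicle-to-sign message: alternating preamble followed by an 8-bit payload.
waveform = fsk_encode("10101010" + "11001001")
```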
| 13. The method in claim 1 is implemented on an autonomous automobile.
| 14. The method in claim 1 further comprises a step of receiving sound signals transmitted by traffic signs.
| 15. The method in claim 14 comprises a step of receiving amplitude modulated sound signals transmitted by traffic signs.
| 16. The method in claim 14 comprises a step of receiving frequency modulated sound signals transmitted by traffic signs.
| 17. A method for detecting nearby traffic conditions for an automobile that comprises the following steps:
transmit sound signals by one or a plurality of sound transmitting devices,
record echoes of said transmitted sound signals measured by one or a plurality of sound detectors in the automobile,
use the signal processing capabilities of the automobile to analyze the recorded echoed sound signals to assess the surroundings of the automobile.
| 18. The method in claim 17 wherein the step of recording sound signals comprises the step of recording sound signals measured by two or more microphones in the automobile.
| 19. The method in claim 17 wherein the step of transmitting sound signals comprises a step of including identification information in the transmitted sound signals.
| 20. The method in claim 17 wherein the step of using the signal processing capabilities of the automobile to analyze echoed sound signals comprises a step of using the Doppler Effect to determine the relative speeds of nearby vehicles. | The method involves recording sound signals measured by multiple sound detectors in the automobile. The signal processing capabilities of the automobile are used to analyze the recorded sound signals to identify sound signals emitted by nearby traffic signs (504) or vehicles (505). The sound signals emitted by nearby traffic signs or vehicles are used for assessing surrounding traffic conditions. The traffic information is provided to direct the driving of the automobile. The recorded sound signals are compared to a database of already known vehicle noise patterns to determine the types of nearby vehicles. The Doppler effect is used for determining the relative speeds of nearby vehicles. An INDEPENDENT CLAIM is included for a method for detecting nearby traffic conditions for an automobile. Method for automobiles for detecting nearby traffic conditions by using sound signals with the help of the Doppler effect. By knowing the type, speed, distance, and direction of each nearby vehicle, the mobile phone is able to rank the level of potential danger that each vehicle poses and provide warnings for the user. The drawing shows a symbolic diagram that shows the traffic conditions near an intersection. 100Pedestrians cell phone109Earphones504Traffic signs505Vehicles581Pedestrian
Please summarize the input | Mine automatic driving vehicle coordination planning method based on vehicle road cooperationThe invention provides a coordinated planning method for automatic driving vehicles under a mine based on vehicle-road cooperation, comprising the following steps: the automatic driving vehicle obtains, via the roadside, a high-precision map of the mine provided by a cloud platform, plans a global smooth navigation path based on the road centre line in the high-precision map, and realizes a smooth reference line; the bottom planner of the automatic driving vehicle performs path and speed decision planning, and sends the track information output by the bottom planner to the control module; the automatic driving vehicle runs normally according to the planning track of the vehicle's bottom layer, and performs advanced planning according to the specific condition; path and speed are decoupled and planned respectively, and the feasible self-vehicle track is solved iteratively; the self-vehicle track of each automatic driving vehicle is input to the control module for executing the transverse and longitudinal control of the automatic driving vehicle to finish the vehicle-meeting action through the mine crossing. The invention effectively plans collision-free tracks for multiple automatic driving vehicles and improves the running efficiency of the automatic driving vehicles under interactive scenes.|1. A coordinated planning method of automatic driving vehicle under mine based on vehicle road cooperation, wherein it comprises the following steps: the automatic driving vehicle obtains, via the roadside, the high-precision map of the mine provided by the cloud platform and, based on the central line of the road in the high-precision map, plans the global smooth navigation path, realizing complete reference line smoothing; according to the planned smooth reference line, the automatic driving vehicle bottom planner performs path and speed decision planning, and the track information output by the bottom planner is sent as output to the control module; the automatic driving vehicle runs normally according to the planning track of the bottom layer of the vehicle, triggering the coordination node when meeting the narrow tunnel meeting and the crossing meeting scene, and performing the advanced planning according to the specific condition; the planning result of the high-level planning period replaces the original reference line of the automatic driving vehicle, path and speed are decoupled and planned respectively, and the feasible self-vehicle track is solved iteratively; the self-vehicle track of each automatic driving vehicle is input to the control module, and the control module executes the transverse and longitudinal control of the automatic driving vehicle to finish the vehicle meeting action through the mine crossing.
| 2. The coordinated planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 1, wherein the automatic driving vehicle obtains, via the roadside, the high-precision map of the mine provided by the cloud platform and, based on the road central line in the high-precision map, plans the global smooth navigation path, realizing complete reference line smoothing, comprising the following steps: the automatic driving vehicle obtains, via the roadside, the high-precision map of the mine provided by the cloud platform and, based on the road central line in the high-precision map, firstly plans the global smooth navigation path; wherein the road central line is a discrete point set that is smoothed to serve as the reference line, the discrete point set of the road central line adopts cubic polynomial connection and uniform sampling to densify the central-line discrete points, and a cubic polynomial connects adjacent discrete points (xi, yi) and (xi+1, yi+1): y = f(x) = a0 + a1x + a2x^2 + a3x^3, wherein a0, a1, a2 and a3 respectively represent the 0-order term coefficient, 1-order term coefficient, 2-order term coefficient and 3-order term coefficient of the cubic polynomial; planning the smooth reference line, searching for the self-vehicle projection point in the discrete point set in a planning period, segmenting based on the projection point, and taking the segmented path as the path section to be smoothed; converting the reference line smoothing problem into a quadratic programming problem based on the segmented path section composed of the densified discrete points, and solving according to the cost function and the constraint condition of the segmented central line point set smoothing to obtain the smooth reference line point set; finally, splicing the reference line segments of different periods so as to realize the smooth reference line.
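Claim 2 connects adjacent centre-line points with a cubic polynomial y = f(x) and samples it uniformly to densify the point set. The sketch below fits one such segment; since the claim states only the cubic form, using the endpoint slopes as the two extra conditions (a Hermite-style fit) is an assumption made here so that the four coefficients are determined.

```python
import numpy as np

def densify_segment(p0, p1, slope0, slope1, n=10):
    """Fit y = a0 + a1*x + a2*x^2 + a3*x^3 between two adjacent centre-line
    points and sample it uniformly. The endpoint-slope conditions are an
    assumption, not stated in the claim.
    """
    x0, y0 = p0
    x1, y1 = p1
    # Four linear conditions: value and slope at both endpoints.
    A = np.array([
        [1, x0, x0**2,  x0**3],
        [1, x1, x1**2,  x1**3],
        [0, 1,  2 * x0, 3 * x0**2],
        [0, 1,  2 * x1, 3 * x1**2],
    ], dtype=float)
    b = np.array([y0, y1, slope0, slope1], dtype=float)
    a0, a1, a2, a3 = np.linalg.solve(A, b)
    xs = np.linspace(x0, x1, n)
    ys = a0 + a1 * xs + a2 * xs**2 + a3 * xs**3
    return np.column_stack([xs, ys])

# Densify one 5 m segment of the centre line into 20 points.
dense = densify_segment((0.0, 0.0), (5.0, 1.0), slope0=0.0, slope1=0.3, n=20)
```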
| 3. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 2, wherein the cost function of the centre line point set smoothing is as follows: wherein w1, w2, w3 are the weight of each item in the cost function, xi, yi and xref, yref are the horizontal and vertical coordinates of the reference line and the density central line, respectively.
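The cost-function formula referred to in claim 3 did not survive extraction. A commonly used three-term smoothing cost with weights w1, w2, w3 (smoothness, segment compactness, deviation from the densified centre-line points) has the shape below; this is offered only as a plausible reconstruction, not as the patent's verbatim expression.

```latex
% Reconstruction of a typical reference-line smoothing cost (not the patent's verbatim formula):
\begin{aligned}
J ={}& w_1 \sum_{i} \left\| 2(x_i, y_i) - (x_{i-1}, y_{i-1}) - (x_{i+1}, y_{i+1}) \right\|^2
   + w_2 \sum_{i} \left\| (x_{i+1}, y_{i+1}) - (x_i, y_i) \right\|^2 \\
  &+ w_3 \sum_{i} \left\| (x_i, y_i) - (x_{\mathrm{ref},i}, y_{\mathrm{ref},i}) \right\|^2
\end{aligned}
```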
| 4. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 3, wherein the high-precision map of the mine stores the road data and the fixed alignment information of the tunnel under the mine as structured data; in the process of performing reference line smoothing processing on the road central line, the projection point of the vehicle on the road central line in each automatic driving vehicle planning period is used as the starting point, the point set within a certain range before and after the smoothing starting point is smoothed, and the smoothed point set is used as the reference line.
| 5. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 1, wherein the automatic driving vehicle bottom planner performs path and speed decision planning according to the planed smooth reference line. The track information output by the bottom planner is used as an output and is sent to the control module, which comprises the following steps: according to the planned smooth reference line, the automatic driving vehicle bottom planner performs path and speed decision planning based on Frenet coordinate system taking the navigation path as coordinate axis, and sends the track information output by the bottom planner as output to the control module; The bottom planner adopts the SLT dimension reduction method to decide the planning process as follows: (1) using SLT dimensionality reduction method to divide into SL layer and ST layer for planning, then constructing path and speed planning problem in SL coordinate system and ST coordinate system: wherein l represents the transverse offset of the automatic driving vehicle path relative to the central line of the road, s represents the longitudinal offset of the automatic driving vehicle path along the central line of the road; t represents the moment corresponding to the longitudinal offset in the speed plan; (2) based on static and low-speed obstacle projection, establishing SL image and discretizing the state space, adopting heuristic search method and numerical optimization method for path decision planning; (3) based on dynamic obstacle track prediction, establishing ST image and discretizing the state space, adopting heuristic search method and numerical optimization method for speed decision planning.
| 6. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 5, wherein the Cartesian coordinate system is converted into the Frenet coordinate system in the planning process of the bottom planner; before the track information is sent to the control module, the Frenet coordinate system is converted back into the global Cartesian coordinate system.
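Claim 6 converts between the Cartesian and Frenet frames around the reference line. A minimal nearest-point projection (s = arc length of the closest reference point plus the tangential component, l = signed lateral offset) is sketched below; the patent does not detail its exact transform, so this is an illustration only.

```python
import numpy as np

def cartesian_to_frenet(point, ref_xy):
    """Project a Cartesian point onto a densely sampled reference line.

    Returns (s, l): s is the arc length along the reference line, l the signed
    lateral offset (positive to the left of the travel direction). A simple
    nearest-point projection is assumed here.
    """
    ref = np.asarray(ref_xy, dtype=float)
    p = np.asarray(point, dtype=float)
    seg = np.diff(ref, axis=0)
    s_profile = np.concatenate([[0.0], np.cumsum(np.hypot(seg[:, 0], seg[:, 1]))])
    i = int(np.argmin(np.hypot(ref[:, 0] - p[0], ref[:, 1] - p[1])))
    i = min(i, len(ref) - 2)                        # use the segment starting at index i
    tangent = seg[i] / (np.hypot(*seg[i]) + 1e-9)
    normal = np.array([-tangent[1], tangent[0]])    # left-hand normal of the tangent
    d = p - ref[i]
    s = s_profile[i] + float(d @ tangent)
    l = float(d @ normal)
    return s, l
```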
| 7. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 5, wherein the path planning in the SL diagram and the speed planning in the ST diagram set different non-uniform sampling scales according to the tunnel scene, firstly performing state space discretization, distributing the cost value of each discrete point according to the cost function, and adopting the improved A* algorithm heuristic search to quickly obtain the initial solution; the initial solution is used as the decision solution to open up the safe space, the original problem is converted into a convex optimization problem based on the safe space, and the optimal track solution is obtained by the convex optimization solving method under the constraint conditions; in the numerical optimization process of path planning in the SL diagram, the cost function is: wherein w1, w2, w3, w4, w5 are the weights of each item in the cost function, and li, lcentre respectively represent the lateral offset of the path in the SL graph and the lateral offset of the reference line; in the numerical optimization process of speed planning in the ST diagram, the cost function is: wherein w1, w2, w3, w4 are the weights of each item in the cost function, and si, vref respectively represent the longitudinal displacement of the path in the ST graph and the reference speed.
| 8. The coordination planning method of automatic driving vehicle under mine based on vehicle-road cooperation according to claim 7, wherein, when the narrow tunnel vehicle-meeting and road junction vehicle-meeting scenes are encountered, the coordination node is triggered and advance planning is performed according to the specific conditions, comprising the following steps: the automatic driving vehicle drives normally according to the planning track of the bottom layer of the vehicle; when the narrow tunnel meeting or intersection meeting scene is encountered, the coordination node is triggered and first judges whether the original tracks of the automatic driving vehicles conflict; if there is no conflict, the vehicles drive according to their original tracks; if there is a conflict, a conflict area is formed in the vehicle interaction area, and a buffer coordination area is formed in front of the conflict area of the mine narrow tunnel meeting and road junction meeting scenes; one or more automatic driving vehicles in the mine tunnel drive into the buffer coordination area, reduce speed or stop, and wait for coordination in turn; the coordination node receives the driving and maneuvering states of all automatic driving vehicles in the intersection buffer coordination area through V2I communication, and the high-level planner performs coordination planning for the vehicles in all buffer coordination areas and generates the coordination reference track through the traffic conflict area; the coordination reference track only considers the automatic driving vehicles in the buffer coordination area and does not consider other static or dynamic obstacles; the path generation of the high-level planner adopts a smoothing optimization method based on straight lines and circular arcs, first sampling the straight lines and arcs into knots and anchor points: Knots: {(x_{k,m}, y_{k,m}, s_{k,m}), m = 0, 1, ..., n_k}; Anchor points: {(x_{a,j}, y_{a,j}, s_{a,j}), j = 0, 1, ..., n_a}; wherein Knots and Anchor points denote the nodes and anchor points of the divided straight lines and arcs, m and j are the indices of the corresponding node and anchor point, (x_{k,m}, y_{k,m}, s_{k,m}) are the transverse and longitudinal coordinates of a node and the accumulated length of the divided straight line, and (x_{a,j}, y_{a,j}, s_{a,j}) are the transverse and longitudinal coordinates of an anchor point and the accumulated length of the divided arc; the reference path between every two adjacent nodes is connected by a quintic polynomial, and a smooth feasible path is then searched near the straight-line and arc path by an optimization method; based on each vehicle's path planning result, the high-level planner performs speed planning in the ST graph for all automatic driving vehicles in range: the interaction between each vehicle and the conflict area is first projected into the ST graph, the state space is then discretized, the passing order of the automatic driving vehicles is determined, and the initial speed solution is searched and optimized in that order; the speed planning of the high-level planner satisfies the constraint that the conflict area is occupied by only one vehicle at any one time; when an automatic driving vehicle enters the buffer coordination area, the coordination node realizes V2I communication with the automatic driving vehicles in the area through the PC5 direct communication interface of C-V2X, and roadside sensing and vehicle-mounted sensing are connected together by V2I communication technology, achieving low-latency, highly reliable data transmission, establishing a reliable information transmission channel, and realizing multi-dimensional, all-round sensing information sharing and cooperative scheduling control; when the coordination node judges that the track of an automatic driving vehicle at an intersection conflicts with the tracks of other automatic driving vehicles, the high-level planner determines the planning start point based on the current motion state of the automatic driving vehicle and re-plans and coordinates all vehicles in the buffer coordination area.
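Claim 8 above connects every two adjacent knots of the sampled straight-line/arc path with a quintic polynomial before searching for a smooth feasible path nearby. The following is a minimal Python sketch of fitting one such segment; parameterizing by the arc length s and the particular boundary values are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def quintic_segment(s0, s1, y0, dy0, ddy0, y1, dy1, ddy1):
    """Fit y(s) = c0 + c1*s + ... + c5*s**5 matching value, first and
    second derivatives at the two segment ends s0 and s1."""
    def rows(s):
        return [
            [1, s, s**2, s**3, s**4, s**5],       # y(s)
            [0, 1, 2*s, 3*s**2, 4*s**3, 5*s**4],  # y'(s)
            [0, 0, 2, 6*s, 12*s**2, 20*s**3],     # y''(s)
        ]
    A = np.array(rows(s0) + rows(s1), dtype=float)
    b = np.array([y0, dy0, ddy0, y1, dy1, ddy1], dtype=float)
    return np.linalg.solve(A, b)  # polynomial coefficients c0..c5

# Example: join two knots 10 m apart with given headings and zero curvature.
coeffs = quintic_segment(0.0, 10.0, 0.0, 0.0, 0.0, 1.5, 0.1, 0.0)
s = np.linspace(0.0, 10.0, 5)
print(np.round(np.polyval(coeffs[::-1], s), 3))  # sample the segment
```

In the method as claimed, segments like this would form the initial path around which the smoothing optimization is then run.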
| 9. The coordination planning method of automatic driving vehicle under mine based on vehicle-road cooperation according to claim 8, wherein the planning result of the high-level planning cycle replaces the original reference line of the automatic driving vehicle, path and speed are planned separately in a decoupled manner, and a feasible self-vehicle track is solved iteratively, comprising the following steps: the planning result of the high-level planning cycle replaces the original reference line of the automatic driving vehicle; each automatic driving vehicle establishes a Frenet coordinate system along the coordinated reference line; the intersection obstacle information sensed by the vehicle-mounted sensors is projected into the SL graph and the ST graph; the bottom-layer planner of the vehicle then re-plans, planning path and speed separately in a decoupled manner and iteratively solving a feasible self-vehicle track; the times at which each automatic driving vehicle enters and exits the conflict area, as output by the high-level planner, are used as the restricted region of the ST graph in the bottom-layer planning, so as to ensure that the re-planned track of the automatic driving vehicle does not conflict with the output tracks of the other automatic driving vehicles, wherein t_sl and t_el denote the time-domain boundaries within which the self-vehicle passes the conflict area in the speed planning result of the high-level planner, and t_in and t_out denote the interaction time of the speed planning result of the self-vehicle's bottom-layer planner with the conflict area; the passing speed in the conflict area shall satisfy the speed constraint that the speed planning result of the bottom-layer planner is less than the speed limit v_1 of the conflict area; the high-level planner plans the coordination reference track of each automatic driving vehicle in the corresponding scene, the bottom-layer planner of each automatic driving vehicle uses the coordination reference track as input to re-plan the self-vehicle so as to avoid obstacles when passing through the conflict area, and the high-level planner and the bottom-layer planner together ensure that each vehicle passes through the conflict area safely and in an orderly manner in turn.
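Claim 9 constrains the re-planned bottom-layer trajectory so that its conflict-area interaction interval [t_in, t_out] stays inside the window [t_sl, t_el] allotted by the high-level planner, and so that its speed inside the conflict area stays below the area's limit. The original inline formulas did not survive extraction, so the sketch below simply restates those two checks in Python; the parameter names mirror the text (v_limit standing for the conflict-area speed limit v_1) and everything else is an assumption.

```python
def replan_is_consistent(t_in, t_out, t_sl, t_el, zone_speeds, v_limit):
    """Check a bottom-layer trajectory against the high-level plan:
    the conflict-zone interval [t_in, t_out] must lie inside the
    allotted window [t_sl, t_el], and the speed samples inside the
    zone must stay below the zone speed limit."""
    inside_window = t_sl <= t_in and t_out <= t_el
    below_limit = all(v < v_limit for v in zone_speeds)
    return inside_window and below_limit

print(replan_is_consistent(t_in=12.4, t_out=15.1, t_sl=12.0, t_el=16.0,
                           zone_speeds=[3.8, 4.1, 3.9], v_limit=5.0))  # True
```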
| 10. The coordination planning method of automatic driving vehicle under mine based on vehicle-road cooperation according to claim 1, wherein the self-vehicle track of each automatic driving vehicle is input to the control module, and the control module executes the transverse and longitudinal control of the automatic driving vehicle to complete the vehicle-meeting maneuver through the mine crossing; the self-vehicle track of the automatic driving vehicle is transformed by coordinate conversion and input to the control module, wherein the transverse control uses a model predictive control method and the longitudinal control uses a PID control method. | The method involves obtaining a high-precision map of the roads in a mine by an automatic driving vehicle. A globally smooth navigation path is planned based on the center line of the road in the high-precision map, and a planned smooth reference line is obtained. A path and speed decision planning process is performed according to the planning track of the bottom layer of the vehicle. The self-vehicle track of each automatic driving vehicle is input to a control module. The control module executes transverse and longitudinal control of the automatic driving vehicle to complete the vehicle-meeting maneuver through a mine crossing. Coordinated planning method of automatic driving vehicle under mine based on vehicle-road cooperation. The collision-free tracks of multiple automatic driving vehicles are effectively planned and the running efficiency of the automatic driving vehicles in interactive scenes is improved. The drawing shows a flow diagram of the planning method. (Drawing includes non-English language text). |
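Claim 10 executes the planned track with model predictive control for the transverse (lateral) direction and PID control for the longitudinal direction. Below is a minimal discrete PID speed tracker of the kind the longitudinal side could use; the gains, time step, and first-order vehicle model are purely illustrative assumptions, not details from the patent.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target, measured):
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Track a 5 m/s speed setpoint with a crude saturated-acceleration model.
pid, speed, dt = PID(kp=0.8, ki=0.2, kd=0.05, dt=0.1), 0.0, 0.1
for _ in range(50):
    accel_cmd = pid.step(target=5.0, measured=speed)
    speed += max(min(accel_cmd, 2.0), -4.0) * dt  # clamp acceleration
print(round(speed, 2))  # approaches the 5 m/s setpoint
```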
Please summarize the input | REDUNDANT COMMUNICATION METHOD, APPARATUS AND SYSTEM FOR COOPERATIVE AUTONOMOUS DRIVING PLATOONINGThe present disclosure relates to Internet of Vehicles technology, and provides a method, an apparatus, and a system for redundant communication for platooning. The method includes: transmitting application data to be transmitted to at least two V2V devices; and controlling the at least two V2V devices that have received the application data to transmit the application data to a predetermined air interface, such that a receiving apparatus obtains the application data from the air interface. With the redundant configuration of the V2V devices, the problem caused by communication failure of a single V2V device can be avoided, so as to ensure stability of V2V communication and guarantee safe operation for platooning.|1-25. (canceled)
| 26. A transmitting apparatus, comprising a first processing device and at least two V2V devices, wherein
the first processing device is configured to transmit application data to the at least two V2V devices, and
the at least two V2V devices are configured to transmit the application data to a predetermined air interface, such that a receiving apparatus obtains the application data from the air interface.
| 27. The transmitting apparatus of claim 26, wherein the first processing device is further configured to:
convert the application data into an Ethernet message, and
transmit the Ethernet message to the at least two V2V devices.
| 28. The transmitting apparatus of claim 27, wherein the at least two V2V devices are further configured to:
packetize the Ethernet message into a V2X message; and
transmit their respectively packetized V2X messages using different frequency bands to air interfaces corresponding to the different frequency bands.
| 29. The transmitting apparatus of claim 28, wherein each of the at least two V2V devices comprises a plurality of antennas, and the at least two V2V devices are further configured to:
transmit their respectively packetized V2X messages using the different frequency bands to the air interfaces corresponding to the different frequency bands via the plurality of antennas provided at each of the at least two V2V devices, wherein each V2V device occupies one frequency band, and the plurality of antennas of each V2V device occupy a same frequency band.
| 30. A receiving apparatus, comprising a second processing device and at least two V2V devices, wherein
each of the at least two V2V devices is configured to obtain application data from an air interface, and
the second processing device is configured to obtain, from each of the at least two V2V devices, the application data corresponding to the V2V device, and fuse and verify the application data to obtain valid data.
| 31. The receiving apparatus of claim 30, wherein:
the air interfaces correspond to a plurality of frequency bands;
each of the at least two V2V devices occupies a different one of the plurality of frequency bands and comprises a plurality of antennas;
the plurality of antennas of each V2V device occupy a same frequency band; and
each of the at least two V2V devices is further configured to:
receive V2X messages from air interfaces corresponding to different frequency bands via a plurality of antennas; and
perform signal fusion on the V2X messages received via the plurality of antennas of the V2V device, to form application data information corresponding to the V2V device.
| 32. The receiving apparatus of claim 31, wherein the second processing device is further configured to:
control each of the at least two V2V devices to decode the application data information corresponding to the V2V device, and packetize the decoded application data information into an Ethernet message; and
receive, from each of the at least two V2V devices, the Ethernet message corresponding to the V2V device.
| 33. The receiving apparatus of claim 32, wherein the second processing device is further configured to:
determine, at an end of a current detection period, one or more V2V devices corresponding to the Ethernet message received in the current detection period, the detection period being a predetermined message communication period;
perform, when only one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the only one V2V device as received in the current detection period to form a first detection result;
determine whether the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data or invalid data based on the first detection result.
| 34. The receiving apparatus of claim 33, wherein the second processing device is further configured to:
perform, when more than one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the more than one V2V device as received in the current detection period to form a second detection result;
determine an Ethernet message to be discarded and an Ethernet message to be verified based on the second detection result; and
discard the Ethernet message to be discarded, and verify the Ethernet message to be verified to obtain valid data or invalid data.
| 35. The receiving apparatus of claim 33, wherein the second processing device is further configured to:
determine whether an identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as an expected message identity known in advance;
set a status flag corresponding to the only one V2V device to a first flag indicating same identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as the expected message identity known in advance; and
maintain a flag corresponding to the only one V2V device as an initial flag to indicate different identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is different from the expected message identity known in advance.
| 36. The receiving apparatus of claim 35, wherein the second processing device is further configured to:
determine whether the flag corresponding to the only one V2V device is the initial flag or the first flag;
determine that the Ethernet message corresponding to the only one V2V device as received in the current detection period is invalid data when the flag corresponding to the only one V2V device is the initial flag; and
determine that the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data when the flag corresponding to the only one V2V device is the first flag.
| 37. The receiving apparatus of claim 33, wherein the second processing device is further configured to:
determine whether an identity of the Ethernet message corresponding to each V2V device as received in the current detection period is same as an expected message identity known in advance;
set a status flag corresponding to each V2V device to a first flag indicating same identity when the identity of the Ethernet message corresponding to the V2V device as received in the current detection period is same as the expected message identity known in advance; and
maintain a flag corresponding to each V2V device as an initial flag to indicate different identity when the identity of the Ethernet message corresponding to the V2V device as received in the current detection period is different from the expected message identity known in advance.
| 38. The receiving apparatus of claim 37, wherein the second processing device is further configured to:
determine whether the flag corresponding to each V2V device is the initial flag or the first flag;
determine that the Ethernet message corresponding to each V2V device as received in the current detection period is an Ethernet message to be discarded when the flag corresponding to the V2V device is the initial flag; and
determine that the Ethernet message corresponding to each V2V device as received in the current detection period is an Ethernet message to be verified when the flag corresponding to the V2V device is the first flag.
| 39. The receiving apparatus of claim 37, wherein the second processing device is further configured to:
calculate data bits in the Ethernet message to be verified corresponding to each V2V device in accordance with a predetermined algorithm to obtain a calculation result corresponding to the V2V device, the predetermined algorithm comprising addition, multiplication, or an MD5 message digest algorithm;
compare the calculation results;
determine the Ethernet message to be verified corresponding to each V2V device to be same, and determine the same Ethernet messages to be verified corresponding to the V2V device as valid data, when the calculation results are same; and
determine the Ethernet message to be verified corresponding to each V2V device as invalid data, when different calculation results exist in the calculation results.
| 40. A method for redundant communication for platooning, comprising:
controlling at least two V2V devices to obtain application data from an air interface; and
obtaining, from the at least two V2V devices, the application data corresponding to each V2V device; and
fusing and verifying the application data to obtain valid data.
| 41. The method of claim 40, wherein:
the air interfaces correspond to a plurality of frequency bands;
each of the at least two V2V devices occupies a different one of the plurality of frequency bands and comprises a plurality of antennas;
the plurality of antennas of each V2V device occupy a same frequency band; and
said controlling the at least two V2V devices to obtain the application data from the air interface comprises:
controlling the at least two V2V devices to receive V2X messages from air interfaces corresponding to different frequency bands via a plurality of antennas of each V2V device; and
controlling each of the at least two V2V devices to perform signal fusion on the V2X messages received via the plurality of antennas of the V2V device, to form application data information corresponding to the V2V device.
| 42. The method of claim 41, wherein said obtaining, from the at least two V2V devices, the application data corresponding to each V2V device comprises:
controlling each of the at least two V2V devices to decode the application data information corresponding to the V2V device, and packetize the decoded application data information into an Ethernet message; and
receiving, from each of the at least two V2V devices, the Ethernet message corresponding to the V2V device via a router or a switch.
| 43. The method of claim 42, wherein said fusing and verifying the application data to obtain the valid data comprises:
determining, at an end of a current detection period, one or more V2V devices corresponding to the Ethernet message received in the current detection period, the detection period being a predetermined message communication period;
performing, when only one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the only one V2V device as received in the current detection period to form a first detection result;
determining whether the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data or invalid data based on the first detection result;
performing, when more than one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the more than one V2V device as received in the current detection period to form a second detection result;
determining an Ethernet message to be discarded and an Ethernet message to be verified based on the second detection result; and
discarding the Ethernet message to be discarded, and verifying the Ethernet message to be verified to obtain valid data or invalid data.
| 44. The method of claim 43, wherein
said performing the message identity detection on the Ethernet message corresponding to the only one V2V device as received in the current detection period to form the first detection result comprises:
determining whether an identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as an expected message identity known in advance;
setting a status flag corresponding to the only one V2V device to a first flag indicating same identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as the expected message identity known in advance; and
maintaining a flag corresponding to the only one V2V device as an initial flag to indicate different identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is different from the expected message identity known in advance, and
said determining whether the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data or invalid data based on the first detection result comprises:
determining whether the flag corresponding to the only one V2V device is the initial flag or the first flag;
determining that the Ethernet message corresponding to the only one V2V device as received in the current detection period is invalid data when the flag corresponding to the only one V2V device is the initial flag; and
determining that the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data when the flag corresponding to the only one V2V device is the first flag.
| 45. A non-transitory computer readable storage medium, having a computer program stored thereon, the program comprising code configured to perform a method for redundant communication for platooning of claim 40. | The method involves sending application data to at least two vehicle-to-vehicle (V2V) devices (101). The at least two V2V devices that have received the application data are controlled (102) to send the data to a preset air interface, so that the receiving device obtains the application data from the air interface. To send the application data, it is first converted into an application data Ethernet message, which is then sent to the at least two V2V devices through a router or a switch. INDEPENDENT CLAIMS are included for the following:a sending end device;a receiving terminal device;a computer readable storage medium on which a computer program is stored; anda redundant communication system for cooperative autonomous driving vehicles. Redundant communication method for a collaborative autonomous driving fleet. The method ensures stability of V2V communication. The drawing shows the flow chart of the method. 101Sending application data to at least two vehicle-to-vehicle devices102Controlling the at least two vehicle-to-vehicle devices that have received the application data to send it to the air interface
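Claims 33-39 above (and the mirrored method claims) describe how the receiver fuses redundant copies of the same Ethernet message arriving from different V2V devices in one detection period: copies whose message identity does not match the expected identity are discarded, a single surviving copy is accepted after the identity check, and multiple surviving copies are accepted only if a digest computed over their data bits agrees. The sketch below is one way to express that verification step in Python; the (identity, payload) message layout and the choice of MD5 among the listed algorithms are assumptions made for illustration.

```python
import hashlib

def fuse_and_verify(messages, expected_identity):
    """messages: dict mapping V2V device id -> (identity, payload bytes)
    received in the current detection period. Returns the payload accepted
    as valid data, or None if no copy matches or the copies disagree."""
    # Keep only copies whose identity matches the expected message identity.
    to_verify = [payload for ident, payload in messages.values()
                 if ident == expected_identity]
    if not to_verify:
        return None                      # every copy discarded as invalid
    if len(to_verify) == 1:
        return to_verify[0]              # single device: accept after identity check
    # Multiple copies: accept only when a digest over the data bits agrees.
    digests = {hashlib.md5(p).hexdigest() for p in to_verify}
    return to_verify[0] if len(digests) == 1 else None

# Example with two redundant V2V devices carrying the same platooning frame.
msgs = {"v2v_a": ("BSM_42", b"\x01\x02\x03"), "v2v_b": ("BSM_42", b"\x01\x02\x03")}
print(fuse_and_verify(msgs, "BSM_42"))
```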
Please summarize the input | Road cloud cooperative automatic driving control method of road end main controlThe invention provides a vehicle road cloud cooperative automatic driving control method for road end main control, comprising the following steps: S1, building a cloud platform database: S1.3, an ID is allocated to each vehicle in the static road environment model and a corresponding archive is established according to the ID; the vehicles comprise automatic driving vehicles and non-automatic driving vehicles, an automatic driving vehicle is allocated a permanent ID, and a non-automatic driving vehicle is allocated a temporary ID; when a non-automatic driving vehicle has been out of the whole control area for one week, the cloud platform automatically deletes the temporary ID of the non-automatic driving vehicle and the corresponding archive; S2, determining the local path planning of the controlled vehicle. The invention assigns IDs to both automatic driving vehicles and non-automatic driving vehicles and establishes matching archives, and the data of non-automatic driving vehicles is periodically deleted according to this rule, which ensures that the database does not store excessive unused information and facilitates rapid management, application and updated transmission between the edge computing centres and the cloud platform.|1. A vehicle road cloud cooperative automatic driving control method for road end main control, wherein the control architecture comprises: S1, constructing a cloud platform database: S1.1, the cloud platform collects a high-precision map through the road side terminal, extracts the content related to driving and removes unrelated information; S1.2, establishing a dimension-reduced static road environment model according to the content related to driving; S1.3, allocating an ID to each vehicle in the static road environment model and establishing a corresponding archive according to the ID, wherein the vehicles comprise automatic driving vehicles and non-automatic driving vehicles, an automatic driving vehicle is allocated a permanent ID, a non-automatic driving vehicle is allocated a temporary ID, and when a non-automatic driving vehicle has been out of the whole control area for one week, the cloud platform automatically deletes the temporary ID of the non-automatic driving vehicle and the corresponding archive; S1.4, the information of the static road environment model updated in S1.3 is recorded to the database in real time and shared to each edge computing centre by area segment, and the method returns to S1.2 for updating, triggered periodically or by events; S2, determining the local path planning of the controlled vehicle: S2.1, the cloud platform determines the controlled vehicle according to the application requirement, and determines the vehicle scheduling instruction and the overall path planning according to the static road environment model; S2.2, the edge computing centre receives the road environment model, the vehicle scheduling instruction and the overall path planning, and establishes a real-time dynamic traffic environment model according to the road environment model and the dynamic data collected in real time; S2.3, the edge computing centre performs local path planning for each automatic driving vehicle according to the real-time dynamic traffic environment model, the vehicle scheduling instruction and the overall path planning; S3, the controlled vehicle executes the local path planning.
| 2. The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S1.3, the archive of the automatic driving vehicle includes basic information and dynamic information of the vehicle, and the archive of the non-automatic driving vehicle includes vehicle information sensed by the roadside device.
| 3. The road cloud cooperative automatic driving control method for road end main control according to claim 2, wherein the basic information comprises vehicle type, size, power parameters, braking parameters, steering capability, real-time battery charge, fault state, historical state and maintenance records; the dynamic information is the vehicle's own state data uploaded in real time.
| 4. The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S1.4, the event is an important road event reported by an edge computing centre, a vehicle or a person.
| 5. The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S2.3, the local path planning is updated at a frequency of 50 Hz, the local path planning comprises ten planning path points, each planning path point carries a coordinate point and the time to reach that path point, and the spacing between adjacent planning path points is inversely proportional to the vehicle speed.
| 6. The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S1-S3, the information of the controlled vehicle with strict time delay requirements is exchanged by V2I direct communication with the edge computing centre through the PC5 interface of the 5G-OBU module, the information of the controlled vehicle with relaxed time delay requirements is exchanged by V2N communication with the cloud platform through the Uu interface of the 5G-OBU, and each edge computing centre communicates with the cloud platform through optical fibre Ethernet. | The vehicle road cloud cooperative automatic driving control method involves constructing a cloud platform database. A cloud platform collects a high-precision map through a road side terminal. A real-time dynamic traffic environment model is established according to a road environment model and dynamic data collected in real time. A local path planning of a controlled vehicle is determined. The cloud platform determines the controlled vehicle according to an application requirement. A vehicle scheduling instruction and an overall path planning are determined according to the static road environment model. The edge computing center receives the road environment model, the vehicle scheduling instruction and the overall path planning. Vehicle road cloud cooperative automatic driving control method for road end main control of vehicle, such as automobile. The data of the non-automatic driving vehicle is periodically deleted according to the rule, which ensures that the database will not store excessive and unused information, which is convenient for the rapid management and application between the edge computing center and the cloud platform. The drawing shows a flow chart of the vehicle road cloud cooperative automatic driving control method. (Drawing includes non-English language text).
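Step S1.3 of claim 1 keeps permanent archives for automatic driving vehicles and temporary archives for non-automatic driving vehicles, deleting a temporary ID and its archive once the vehicle has been out of the control area for one week. The Python sketch below illustrates that bookkeeping rule; the data structures and timestamp handling are assumptions, not part of the claim.

```python
from datetime import datetime, timedelta

class CloudArchive:
    """Toy registry of vehicle archives keyed by ID (mirroring S1.3)."""
    RETENTION = timedelta(weeks=1)

    def __init__(self):
        # vehicle_id -> {"permanent": bool, "left_area_at": datetime or None}
        self.archives = {}

    def register(self, vehicle_id, permanent):
        self.archives[vehicle_id] = {"permanent": permanent, "left_area_at": None}

    def mark_left_area(self, vehicle_id, when):
        self.archives[vehicle_id]["left_area_at"] = when

    def purge(self, now):
        """Drop temporary IDs whose vehicles left the control area over a week ago."""
        expired = [vid for vid, a in self.archives.items()
                   if not a["permanent"]
                   and a["left_area_at"] is not None
                   and now - a["left_area_at"] >= self.RETENTION]
        for vid in expired:
            del self.archives[vid]
        return expired

db = CloudArchive()
db.register("AV-001", permanent=True)      # automatic driving vehicle: permanent ID
db.register("TMP-017", permanent=False)    # non-automatic driving vehicle: temporary ID
db.mark_left_area("TMP-017", datetime(2024, 1, 1))
print(db.purge(datetime(2024, 1, 9)))      # -> ['TMP-017']
```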
Please summarize the input | Method of using a multi-input and multi-output (MIMO) antenna array for high-resolution radar imaging and wireless communication for advanced driver assistance systems (ADAS) and autonomous drivingA method of using a multi-input multi-output (MIMO) antenna array for high-resolution radar imaging and wireless communication for advanced driver assistance systems (ADAS) utilizes a MIMO radar and at least one base station. The MIMO radar establishes wireless communication with the base station via an uplink signal. Likewise, the base station sends a downlink signal to the MIMO radar. Further, unlike conventional vehicle-to-everything (V2X) systems that filter out the reflected uplink signal, the MIMO radar uses the reflected uplink signal to detect a plurality of targets. Accordingly, the MIMO radar derives spatial positioning data for each target from the reflected uplink signal.What is claimed is:
| 1. A method of using a multi-input and multi-output (MIMO) antenna array for radar imaging and wireless communication for advanced driver assistance systems (ADAS) and autonomous driving, the method comprises the steps of:
(A) providing a multi-input and multi-output (MIMO) radar and at least one base station;
(B) transmitting an uplink signal from the MIMO radar to the at least one base station;
(C) receiving a downlink signal from the at least one base station with the MIMO radar;
(D) receiving a reflected uplink signal with the MIMO radar, wherein the reflected signal is reflected off objects surrounding the MIMO radar;
providing a plurality of transmitters, a plurality of receivers, and a RF controller for the MIMO radar;
providing a PN-code regulator managed by the MIMO radar, wherein the reflected uplink signal is encoded with a spread spectrum coding scheme;
receiving an ambient signal with the MIMO radar;
cancelling a cross-talk portion of the ambient signal with the RF controller during step (D), wherein the cross-talk portion is generated from direct communication between the plurality of transmitters and the plurality of receivers;
filtering the reflected uplink signal from the ambient signal with the RF controller during step (D);
despreading the reflected uplink signal through the PN-code regulator with the RF controller;
estimating a detection time delay for the spatial positioning data for each target with the RF controller;
(E) processing communication data from the downlink signal with the MIMO radar;
(F) detecting a plurality of targets within the reflected uplink signal with the MIMO radar; and
(G) deriving spatial positioning data for each target from the reflected uplink signal with the MIMO radar.
| 2. The method as claimed in claim 1 further comprises the steps of:
providing a pseudo-noise (PN) generator managed by the MIMO radar; and
encoding the uplink signal through the PN generator with the MIMO radar during step (B), wherein a spread spectrum coding scheme is applied to the uplink signal by the PN generator.
| 3. The method as claimed in claim 1 comprises:
providing a RF controller for the MIMO radar;
receiving an ambient signal with the MIMO radar; and
filtering the downlink signal from the ambient signal with the RF controller during step (C).
| 4. The method as claimed in claim 1 further comprises the steps of:
providing an adaptive noise canceller for the MIMO radar; and
capturing the cross-talk portion of the ambient signal with the adaptive noise canceller.
| 5. The method as claimed in claim 1 further comprises the steps of:
executing a plurality of iterations for steps (B) through (G);
transmitting an omni-directional uplink signal during step (B) of an initial iteration, wherein the initial iteration is from the plurality of iterations;
receiving a reflected omni-directional uplink signal during step (D) of the initial iteration; and
detecting a plurality of targets during step (F) of the initial iteration.
| 6. The method as claimed in claim 1 further comprises the steps of:
executing a plurality of iterations for steps (B) through (G);
beamforming a uni-directional uplink signal towards each target detected in a previous iteration during step (B) of an arbitrary iteration, wherein the arbitrary iteration is any iteration from the plurality of iterations, and wherein the previous iteration precedes the arbitrary iteration in the plurality of iterations;
receiving a uni-directional reflected uplink signal for each target detected in the previous iteration during step (D) of the arbitrary iteration; and
detecting a plurality of targets during step (F) of the arbitrary iteration, wherein each target corresponds to the uni-directional reflected uplink signal for each target detected in the previous iteration. | The method involves transmitting an uplink signal from a Multi-Input Multi-Output (MIMO) radar to at least one base station, and receiving a downlink signal from the at least one base station with the MIMO radar. A reflected uplink signal reflected off objects surrounding the MIMO radar is received with the MIMO radar. Communication data from the downlink signal is processed with the MIMO radar. Multiple targets within the reflected uplink signal are detected with the MIMO radar. Spatial positioning data for each target is derived from the reflected uplink signal with the MIMO radar. Method of using a Multi-Input Multi-Output (MIMO) antenna array for high-resolution radar imaging and communication for Advanced Driver Assistance Systems (ADAS) and autonomous driving of vehicle. By encoding the reflected uplink signal with a spread spectrum coding scheme, the bandwidth of the uplink signal is spread and the uplink signal is made more resistant to jamming and noise. Beamforming is made possible by transmitting in-phase signals through each antenna in the antenna array which allows the transmittance of the high-energy uni-directional uplink signal towards each target. The drawing is a schematic diagram of a vehicle communication and radar sensing system. |
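The claims above have the RF controller filter the PN-coded reflected uplink signal out of the ambient signal, despread it through the PN-code regulator, and estimate a detection time delay per target. The numpy sketch below illustrates the core idea — correlating a delayed echo against the known PN sequence and reading the delay (and hence range) off the correlation peak. The chip rate, code length, noise level, and single-target setup are illustrative assumptions, not parameters from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)
chip_rate = 10e6                              # 10 Mchip/s (assumed)
pn = rng.choice([-1.0, 1.0], size=1023)       # pseudo-noise spreading code

# Simulate an echo of the spread uplink delayed by 37 chips plus noise.
true_delay = 37
rx = np.zeros(2048)
rx[true_delay:true_delay + pn.size] += 0.5 * pn
rx += 0.2 * rng.standard_normal(rx.size)

# Despread by sliding correlation and take the peak as the detection delay.
corr = np.correlate(rx, pn, mode="valid")
est_delay_chips = int(np.argmax(corr))
est_range_m = 0.5 * (est_delay_chips / chip_rate) * 3e8   # halve for two-way travel
print(est_delay_chips, round(est_range_m, 1))
```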
Please summarize the input | AUTONOMOUS VEHICLE ACTIVE INTERACTION WITH SURROUNDING ENVIRONMENTAn automated vehicle (AV) which automatically interacts with objects in a surrounding environment based on the objects' determined intentions and the actions predicted from those intentions. Data is collected from an external environment by cameras, sensors, and optionally other devices on an AV. The data is processed to identify objects and a state for each object, and an interaction scenario is identified. For objects within the interaction scenario, an intention for each object is determined, and the action of the object is predicted. The AV generates a decision to perform an action to communicate the AV's action to one or more objects. Commands are generated to execute the decision, and the intention of the AV is implemented by executing the commands using one or more output mechanisms (horn, turn signal, display, and/or other mechanisms) for the
AV.|1. An autonomous vehicle system for automatically interacting with a surrounding environment, the system comprising:
a data processing system comprising one or more processors, a memory, a planning module, and a control module, the data processing system to:
detect, from received sensor data, an object in an interaction scenario in an external environment;
monitor the object in response to detecting the interaction scenario;
determine an intention for the object within the external environment based on the monitoring, wherein the intention is determined based on detected object gestures and a detected object state;
generate an object prediction based on the determined object intention and the detected object state; and
generate one or more commands to indicate an intention of the autonomous vehicle in response to the generated prediction of the object.
| 2. The system of claim 1, the data processing system further to:
predict an action of the object based on the determined intention of the object; and
determine an action to indicate the intention of the autonomous vehicle to the object, the one or more commands generated to implement the action.
| 3. The system of claim 1, the data processing system further to:
detect the interaction scenario based on the received sensor data; and
monitor an activity of the object within the interaction scenario.
| 4. The system of claim 1, wherein the object includes a pedestrian or a vehicle.
| 5. The system of claim 1, wherein the intention of the object is determined at least in part based on gestures performed by the object and detected by the data processing system.
| 6. The system of claim 1, wherein the received sensor data include semantic information to describe the object.
| 7. The system of claim 1, wherein the received sensor data includes one or more of the following: a vehicle location, a vehicle action, a pedestrian location, and a pedestrian action.
| 8. The system of claim 1, wherein the data processing system is further configured to select the intention of the autonomous vehicle based on the current object state.
| 9. The system of claim 1, wherein the autonomous vehicle is configured to signal the intention via at least one of the following: a visual indication, an audio indication, and a Vehicle-to-everything (V2X) communication.
| 10. The system of claim 1, wherein the interaction of the autonomous vehicle is determined at least in part on policies associated with traffic rules.
| 11. The system of claim 1, wherein the intention of the object is determined at least in part based on gestures performed by the object and detected by the data processing system.
| 12. A method for automatically interacting with a surrounding environment by an autonomous vehicle, the method comprising:
detecting, by a data processing system from received sensor data, an object in an interaction scenario in an external environment;
monitoring the object in response to detecting the interaction scenario;
determine an intention for the object within the external environment based on the monitoring, wherein the intention is determined based on detected object gestures and a detected object state;
generating an object prediction based on the determined object intention and the detected object state; and
generate one or more commands to indicate an intention of the autonomous vehicle in response to the generated prediction of the object.
| 13. The method of claim 12, the data processing system further to:
predict an action of the object based on the determined intention of the object; and
determine an action to indicate the intention of the autonomous vehicle to the object, the one or more commands generated to implement the action.
| 14. The method of claim 12, the data processing system further to:
detect the interaction scenario based on the received sensor data; and
monitor an activity of the object within the interaction scenario.
| 15. The method of claim 12, wherein the object includes a pedestrian or a vehicle.
| 16. The method of claim 12, wherein the intention of the object is determined at least in part based on gestures performed by the object and detected by the data processing system.
| 17. The method of claim 12, wherein the received sensor data include semantic information to describe the object.
| 18. The method of claim 12, wherein the received sensor data includes one or more of the following: a vehicle location, a vehicle action, a pedestrian location, and a pedestrian action.
| 20. A non-transitory computer readable storage medium having embodied thereon a program, the program being executable by a processor to perform a method for automatically interacting with a surrounding environment by an autonomous vehicle, the method comprising:
detecting, by a data processing system from received sensor data, an object in an interaction scenario in an external environment;
monitoring the object in response to detecting the interaction scenario;
determine an intention for the object within the external environment based on the monitoring, wherein the intention is determined based on detected object gestures and a detected object state;
generating an object prediction based on the determined object intention and the detected object state; and
generate one or more commands to indicate an intention of the autonomous vehicle in response to the generated prediction of the object. | The autonomous vehicle system comprises a data processing system having multiple processors, a memory, a planning module (412), and a control module (414). The data processing system detects an object in an interaction scenario in an external environment (510) from received sensor data. The object is monitored in response to the detecting the interaction scenario. An intention is determined for the object within the external environment based on the monitoring. An object prediction is generated based on the determined object intention and the detected object state. The commands are generated to indicate an intention of the autonomous vehicle in response to the generated prediction of the object. INDEPENDENT CLAIMS are included for the following:a method for automatically interacting with a surrounding environment by an autonomous vehicle; anda non-transitory computer readable storage medium. Autonomous vehicle system for automatically interacting with a surrounding environment. Enhances the safety and efficiency of the automated vehicle at intersections. The drawing shows a block representation of a system for automatically interacting with a surrounding environment by an autonomous vehicle.412Planning module414Control module420Perception Module510External environment530Monitoring Module |
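The claims above describe a pipeline of detecting an object in an interaction scenario, determining its intention from gestures and state, predicting its action, and generating commands that signal the AV's own intention back to it. The sketch below is a deliberately simple rule-based stand-in for that pipeline; the gesture labels, states, and output commands are invented for illustration and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    kind: str            # "pedestrian" or "vehicle"
    gesture: str         # e.g. "waving", "looking_at_av", "none"
    moving: bool
    near_crosswalk: bool

def infer_intention(obj: ObjectState) -> str:
    # Intention from detected gestures plus the detected object state.
    if obj.kind == "pedestrian" and obj.near_crosswalk and (obj.gesture != "none" or not obj.moving):
        return "wants_to_cross"
    return "continue"

def predict_action(intention: str) -> str:
    return "enter_roadway" if intention == "wants_to_cross" else "keep_course"

def av_commands(predicted_action: str):
    # Commands that indicate the AV's intention via its output mechanisms.
    if predicted_action == "enter_roadway":
        return ["yield", "flash_display:PLEASE_CROSS", "send_v2x_yield_notice"]
    return ["proceed"]

ped = ObjectState(kind="pedestrian", gesture="waving", moving=False, near_crosswalk=True)
print(av_commands(predict_action(infer_intention(ped))))
```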
Please summarize the input | Surface Detection Via a Directed Autonomous VehicleA number of illustrative variations may include the steps of providing a first vehicle including at least one sensor, a controller configured to process sensor data, and a vehicle communication system; providing a driving surface having an actual coefficient of friction; determining at least one estimated driving surface coefficient of friction; communicating the at least one estimated driving surface coefficient from the first vehicle to the vehicle communication system; and communicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle directly or indirectly.What is claimed is:
| 1. A method comprising:
providing a first vehicle comprising at least one sensor, a controller configured to process sensor data, and a vehicle communication system;
driving the first vehicle on a driving surface having an actual coefficient of friction;
determining at least one estimated driving surface coefficient of friction;
communicating the at least one estimated driving surface coefficient from the first vehicle to the vehicle communication system;
communicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle.
| 2. A method as set forth in claim 1 wherein the first vehicle further comprises a braking system configured to manipulate a brake set, a steering system configured to adjust a roadwheel direction, and a propulsion system configured to deliver driving power to the road wheels; and
wherein determining at least one estimated driving surface coefficient of friction comprises manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle.
| 3. A method as set forth in claim 1 wherein determining at least one estimated driving surface coefficient of friction is accomplished via the at least one sensor.
| 4. A method as set forth in claim 1 wherein the vehicle communication system is a cloud-based vehicle-to-vehicle communication system.
| 5. A method as set forth in claim 1 wherein the vehicle communication system is a vehicle-to-everything communication system.
| 6. A method as set forth in claim 1 wherein the first vehicle is an unmanned ground vehicle.
| 7. A method as set forth in claim 1 wherein the first vehicle is an unmanned aerial vehicle.
| 8. A method as set forth in claim 1, further comprising using the at least one estimated driving surface coefficient of friction to manipulate at least one of a braking system, a steering system, or a propulsion system of the at least one other vehicle.
| 9. A method comprising:
providing an unmanned ground vehicle comprising at least one sensor, a controller configured to process sensor data, and a vehicle communication system;
driving the first vehicle on a driving surface having an actual coefficient of friction;
determining at least one estimated driving surface coefficient of friction via the at least one sensor; and
communicating the at least one estimated driving surface coefficient from the unmanned ground vehicle to the vehicle communication system.
| 10. A method as set forth in claim 9, further comprising communicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle.
| 11. A method as set forth in claim 10 wherein the at least one other vehicle comprises a braking system configured to manipulate a brake set, a steering system configured to adjust a roadwheel direction, and a propulsion system configured to deliver driving power to the road wheels; and
further comprising using the at least one estimated driving surface coefficient of friction to manipulate at least one of the braking system, steering system, and propulsion system of the other vehicle.
| 12. A method as set forth in claim 9 wherein the vehicle communication system is a cloud-based vehicle-to-vehicle communication system.
| 13. A method as set forth in claim 9 wherein the vehicle communication system is a vehicle-to-everything communication system.
| 14. A method as set forth in claim 9 wherein determining at least one estimated driving surface coefficient of friction via the at least one sensor additionally comprises performing unmanned ground vehicle maneuvers comprising manipulating at least one of vehicle speed, acceleration, direction, or braking.
| 15. A method comprising:
providing an unmanned aerial vehicle comprising at least one sensor, a controller configured to process sensor data, and a vehicle communication system;
determining at least one estimated driving surface coefficient of friction of a driving surface via the at least one sensor; and
communicating the at least one estimated driving surface coefficient from the unmanned aerial vehicle to the vehicle communication system.
| 16. A method as set forth in claim 15, further comprising communicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle.
| 17. A method as set forth in claim 16 wherein the at least one other vehicle comprises a braking system configured to manipulate a brake set, a steering system configured to adjust a roadwheel direction, and a propulsion system configured to deliver driving power to the road wheels; and
further comprising using the at least one estimated driving surface coefficient of friction to manipulate at least one of the braking system, steering system, or propulsion system of the at least one other vehicle.
| 18. A method as set forth in claim 16 wherein the vehicle communication system is a cloud-based vehicle-to-vehicle communication system.
| 19. A method as set forth in claim 16 wherein the vehicle communication system is a vehicle-to-everything communication system.
| 20. A method as set forth in claim 2 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed at the maximum capability of the first vehicle.
| 21. A method as set forth in claim 2 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without a passenger in the vehicle at a capability of the first vehicle that would otherwise result in injury to a passenger in the vehicle.
| 22. A method as set forth in claim 2 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without cargo in the vehicle at a capability of the first vehicle that would otherwise result in damage to cargo in the vehicle.
| 23. A method as set forth in claim 11 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed at the maximum capability of the first vehicle.
| 24. A method as set forth in claim 11 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without a passenger in the vehicle at a capability of the first vehicle that would otherwise result in injury to a passenger in the vehicle.
| 25. A method as set forth in claim 11 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without cargo in the vehicle at a capability of the first vehicle that would otherwise result in damage to cargo in the vehicle. | The method involves providing a vehicle (14) with a sensor (16), where a control unit (18) is set up to process sensor data (20). A vehicle communication system (22) is provided, and the vehicle is driven (24) on a roadway (26) with an actual coefficient of friction (28). An estimated road surface friction coefficient (32) is determined (30), and the estimated road surface coefficient is communicated (34) from the vehicle to the vehicle communication system. The estimated road surface coefficient is communicated (36) from the vehicle communication system to another vehicle (38). The vehicle is provided with a braking system (40) to actuate a set of brakes (42). Method for performing surface detection by a guided autonomous vehicle. The control unit is set up to process sensor data while the vehicle is driven on a roadway with an actual coefficient of friction, which helps prevent unintended imbalances in the driving force transferred from each wheel and enables effective surface detection by a guided autonomous vehicle. The drawing shows a flowchart of a method for performing surface detection by a guided autonomous vehicle. 12Providing a vehicle with a sensor14,38Vehicles16Sensor18Control unit20Sensor data22Vehicle communication system24Driving a vehicle on a roadway with an actual coefficient of friction26Roadway28Actual coefficient of friction30Determining an estimated road surface friction coefficient32Estimated road surface friction coefficient34Communicating an estimated road surface coefficient from a vehicle to a vehicle communication system36Communicating a road surface coefficient from a vehicle communication system to another vehicle40Braking system42Brakes
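The claims above determine an estimated driving-surface coefficient of friction via on-board sensors (optionally by deliberately exercising the brakes, steering, or propulsion) and communicate it over the vehicle communication system so other vehicles can adapt their braking, steering, or propulsion. The sketch below uses one common, here assumed estimator — the ratio of measured deceleration to gravity while the wheels are near their slip limit — and a stubbed broadcast; the estimator, thresholds, and message format are assumptions, not claim language.

```python
G = 9.81  # m/s^2

def estimate_mu(deceleration_mps2, wheel_slip_ratio, slip_threshold=0.15):
    """Rough friction estimate: only trust the a/g ratio when the wheels
    are actually near their traction limit (high measured slip)."""
    if wheel_slip_ratio < slip_threshold:
        return None                       # not enough excitation to estimate
    return min(deceleration_mps2 / G, 1.2)

def broadcast_friction(mu, position, send):
    """send() stands in for the vehicle communication system (V2V/V2X/cloud)."""
    if mu is not None:
        send({"type": "friction_report", "mu_est": round(mu, 2), "pos": position})

messages = []
mu = estimate_mu(deceleration_mps2=2.8, wheel_slip_ratio=0.2)
broadcast_friction(mu, position=(57.71, 11.97), send=messages.append)
print(messages)   # receiving vehicles would adapt braking/steering limits to mu_est
```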
Please summarize the input | Safety method for a modular autonomous vehicle and a control device thereforA safety method, performed by a control device for a vehicle assembled from a set of modules, the vehicle including at least two modules: at least one drive module and at least one functional module. The control device is in any of the at least two modules. The at least one drive module has a pair of wheels and is configured to be autonomously operated. The method includes detecting (s101) an emergency situation in any of the at least two modules of the assembled vehicle, transmitting (s102) information about the detected emergency situation to a control center and controlling (s103) the module associated with the emergency situation to physically disconnect from the assembled vehicle. A computer program, a computer-readable medium, a control device and a vehicle are also included.The invention claimed is:
| 1. A safety method, performed by a control device for a vehicle assembled from a set of modules, the vehicle comprising one or more of at least two modules, including:
at least one drive module; and
at least one functional module;
wherein the control device is comprised in any one or more of the at least two modules and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated;
the method comprising:
detecting, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle;
transmitting, by a transmitter, information about the detected emergency situation to a control center; and
controlling the module associated with the emergency situation to physically disconnect from the assembled vehicle.
| 2. The method according to claim 1, further comprising:
controlling the disconnected module to move away from the at least one remaining module of the assembled vehicle and/or controlling the at least one remaining module of the assembled vehicle to move away from the disconnected module.
| 3. The method according to claim 1, wherein, after transmitting information about the detected emergency situation to a control center, and before controlling the module associated with the emergency situation to physically disconnect from the assembled vehicle;
the method further comprising:
receiving, from the control center, a command to physically disconnect the at least one drive module from the assembled vehicle.
| 4. The method according to claim 1, wherein, before controlling the module associated with the emergency situation to physically disconnect from the assembled vehicle, the method further comprises:
activating an alarm informing about the emergency situation.
| 5. The method according to claim 1, further comprising:
identifying a safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment; and
controlling the assembled vehicle to move to the identified safe space prior to physically disconnecting the module.
| 6. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is performed by means of a second sensor element comprising a radar, a lidar or a camera.
| 7. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on information from the control center via 4G, 5G, V2I, Wi-Fi or any other wireless communication means.
| 8. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on a type of the at least one functional module.
| 9. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on a type of load in the at least one functional module.
| 10. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on a type of emergency situation in the assembled vehicle.
| 11. The method according to claim 1, wherein controlling the module associated with the emergency situation to physically disconnect from the assembled vehicle also comprises controlling the module to electrically disconnect from the assembled vehicle.
| 12. The method according to claim 1, wherein the assembled vehicle comprises two drive modules and the at least one functional module, and wherein one of the drive modules is configured to operate as a master and the other drive module is configured to operate as a slave;
the method further comprises, when an emergency situation is detected in the master drive module:
controlling the drive module configured to operate as a slave to operate as master.
| 13. The method according to claim 12, further comprising, when an emergency situation is detected in the at least one functional module:
controlling both drive modules to physically disconnect from the assembled vehicle.
| 14. The method according to claim 1, further comprising detecting an emergency situation by means of the first sensor element including a temperature sensor, a pressure sensor, a smoke sensor, a particle sensor, a gas sensor and/or a camera arranged on the assembled vehicle.
| 15. A computer memory storing program instructions which, when the program instructions are executed by a computer, cause the computer to carry out a method performed by the computer for a vehicle assembled from a set of modules, wherein the vehicle comprises one or more of at least two modules including: at least one drive module and at least one functional module, wherein the computer is comprised in any one or more of the at least two modules and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated, wherein the method comprises:
detecting, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle;
transmitting, by a transmitter, information about the detected emergency situation to a control center; and
controlling the module associated with the emergency situation to physically disconnect from the assembled vehicle.
| 16. A control device of a vehicle assembled from a set of modules, the vehicle comprising one or more of at least two modules, including:
at least one drive module; and
at least one functional module;
wherein the control device is comprised in any one or more of the at least two modules, and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated;
the control device being configured to:
detect, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle;
transmit, by a transmitter, information about the detected emergency situation to a control center; and
control the module associated with the detected emergency situation to physically disconnect from the assembled vehicle.
| 17. A vehicle assembled from a set of modules, wherein the vehicle comprises at least one control device, wherein the set of modules comprises one or more of at least two modules including: at least one drive module, and at least one functional module; and wherein the control device is comprised in any one or more of the at least two modules; and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated; and the control device being configured to:
detect, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle;
transmit, by a transmitter, information about the detected emergency situation to a control center; and
control the module associated with the detected emergency situation to physically disconnect from the assembled vehicle. | The method involves detecting (s101) an emergency situation in any of two modules of an assembled vehicle. The information about the detected emergency situation is transmitted (s102) to a control centre. The module associated with the emergency situation is controlled (s103) to physically disconnect from the assembled vehicle. The disconnected module is controlled to move away from the remaining module of the assembled vehicle, and/or the remaining module of the assembled vehicle is controlled to move away from the disconnected module. The command is received to physically disconnect the drive module from the assembled vehicle. INDEPENDENT CLAIMS are included for the following:a computer program for vehicle assembled from set of modules;a control device of vehicle assembled from set of modules; anda vehicle assembled from set of modules. Safety method for vehicle such as bus and truck assembled from set of modules. The assembled vehicle is quickly and easily disassembled without manual work. The safe distance to the surrounding objects is maintained, and the accidents are avoided. The assembled vehicle is controlled to move to the identified safe space prior to physically disconnecting the module. The drawing shows a flowchart illustrating the safety process. s101Step for involves detecting emergency situation in any of two modules of assembled vehicles102Step for transmitting information about detected emergency situation to control centres103Step for controlling module associated with the emergency situation to physically disconnect from assembled vehicle |
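Claims 11-13 of the record above reduce to a small decision procedure: report the detected emergency to the control center, promote the slave drive module to master when the faulty module is the master drive module, and work out which modules should be physically (and electrically) disconnected. The following is a minimal sketch of that flow; the class and function names and the control-center callback are hypothetical illustrations, not anything specified in the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, List, Optional

class Role(Enum):
    MASTER = auto()
    SLAVE = auto()

@dataclass
class Module:
    module_id: str
    kind: str                      # "drive" or "functional"
    role: Optional[Role] = None    # only drive modules carry a role

def handle_emergency(modules: List[Module], faulty_id: str,
                     notify_control_center: Callable[[dict], None]) -> List[Module]:
    """Report the emergency, promote the slave drive module if the master is
    faulty (claim 12), and return the modules that should be physically and
    electrically disconnected from the assembled vehicle (claims 11 and 13)."""
    faulty = next(m for m in modules if m.module_id == faulty_id)
    notify_control_center({"module": faulty.module_id, "kind": faulty.kind})

    if faulty.kind == "drive" and faulty.role is Role.MASTER:
        for m in modules:
            if m.kind == "drive" and m.role is Role.SLAVE:
                m.role = Role.MASTER          # slave takes over as master
                break
        return [faulty]
    if faulty.kind == "functional":
        # Claim 13: both drive modules detach from a faulty functional module.
        return [m for m in modules if m.kind == "drive"]
    return [faulty]

modules = [Module("d1", "drive", Role.MASTER),
           Module("d2", "drive", Role.SLAVE),
           Module("f1", "functional")]
print([m.module_id for m in handle_emergency(modules, "f1", print)])  # ['d1', 'd2']
```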
Please summarize the input | FRICTION MONITORING SYSTEM FOR A VEHICLE AND A METHOD PERTAINING TO SUCH A SYSTEMA friction monitoring system (2) for vehicles (4, 16) which comprises a slipperiness detection device (6) suited to making measurements of at least one parameter related to slipperiness of a roadway close to a first vehicle (4), to determining at least one friction value on the basis of the measurement and to generating a friction signal (8) comprising said friction value determined. Also provided is a processing device (10) adapted to receiving said friction signal (8) and to generating a slipperiness information signal (12) comprising said friction value. The friction monitoring system (2) comprises also a first communication device (14) situated in the first vehicle (4) and adapted to receiving said slipperiness information signal (12) and to transmitting a processed slipperiness information signal (15) wirelessly in a format such that one or more other vehicles (16) can receive the processed signal (15), process it and, where necessary, activate at least one skid protection system (17) in said other vehicle on the basis of the information contained in the processed signal (15), said slipperiness information signal (12) being arranged to be passed on and, where necessary, to activate at least one skid protection system (22) of the first vehicle (4) in accordance with a set of dynamic activation rules.|1. A friction monitoring system (2) for vehicles (4, 16) which comprises a slipperiness detection device (6) suited to making measurements of at least one parameter related to slipperiness of a roadway close to a first vehicle (4), to determining at least one friction value on the basis of the measurement and to generating a friction signal (8) comprising said friction value determined, a processing device (10) adapted to receiving said friction signal (8) and to generating a slipperiness information signal (12) comprising said friction value, a first communication device (14) situated in said first vehicle (4) and adapted to receiving said slipperiness information signal (12) and to transmitting a processed slipperiness information signal (15) wirelessly in a format such that one or more other vehicles (16) can receive the processed signal (15), process it and, where necessary, activate at least one skid protection system (17) in said other vehicle on the basis of the information in the processed signal received (15), said slipperiness information signal (12) is arranged to be passed on and, where necessary, to activate at least one skid protection system (22) of the first vehicle (4) in accordance with a set of dynamic activation rules characterised in that said set of dynamic activation rules comprises parameters related to nearby vehicles.
| 2. The friction monitoring system (2) according to claim 1, in which said nearby vehicles are part of a vehicle train.
| 3. The friction monitoring system (2) according to claim 2, in which said parameters comprise the length of the vehicle train.
| 4. The friction monitoring system (2) according to any one of claims 1-3, which comprises a location determination device (19) adapted to determining the location of the first vehicle, to determining a location value on the basis of the location determined and to generating a location signal (20) on the basis of said location value determined, said processing device (10) being adapted to receiving said location signal (20) and to generating said slipperiness information signal (12) comprising coordinated friction values and location values.
| 5. The friction monitoring system (2) according to claim 4, in which the processing device (10) is adapted to relating each friction value to a location value so that a specific friction value unambiguously indicates how slippery the roadway is at a given location.
| 6. The friction monitoring system (2) according to any one of claims 1-5, in which at least one of said first and second vehicles (4,16) is part of a vehicle train.
| 7. The friction monitoring system (2) according to any one of claims 1-6, in which at least one of said first and second vehicles (4,16) is an autonomous vehicle in a vehicle train.
| 8. The friction monitoring system (2) according to any one of the foregoing claims, in which said format for the processed slipperiness information signal (15) is suited to vehicle-to-vehicle transmission.
| 9. The friction monitoring system (2) according to any one of the foregoing claims, in which said format for the processed slipperiness information signal (15) is suited to vehicle-to-infrastructure transmission.
| 10. A method for a skid protection system for vehicles, which method comprises
* - making measurements of at least one parameter related to slipperiness of a roadway close to a first vehicle,
* - determining at least one friction value on the basis of the measurement,
* - generating a friction signal comprising said friction value determined,
* - receiving said friction signal in a processing device and generating a slipperiness information signal comprising said friction value,
* - receiving said slipperiness information signal in a communication device in said first vehicle,
* - sending a processed slipperiness information signal out wirelessly in a format such that one or more other vehicles can receive the signal,
* - processing the processed slipperiness information signal received and, where necessary, activating at least one skid protection system in said other vehicle on the basis of the information in the slipperiness information signal received, and
* - acting upon and, where necessary, activating at least one skid protection system of the first vehicle in accordance with a set of dynamic activation rules and characterised in that said set of dynamic activation rules comprises parameters related to nearby vehicles.
| 11. The method according to claim 10, in which said nearby vehicles are part of a vehicle train.
| 12. The method according to claim 11, in which said parameters comprise the length of the vehicle train.
| 13. The method according to any one of claims 10-12, which comprises
* - determining the location of the first vehicle in a location measuring device,
* - determining a location value on the basis of the location determined,
* - generating a location signal on the basis of said location value,
* - receiving said location signal in said processing device,
* - generating said slipperiness information signal comprising coordinated friction values and location values.
| 14. The method according to claim 13, in which the processing device is adapted to relating each friction value to a location value so that a specific friction value unambiguously indicates how slippery the roadway is at a given location.
| 15. The method according to any one of claims 10-14, in which at least one of said first and second vehicles is part of a vehicle train.
| 16. The method according to any one of claims 10-15, in which at least one of said first and second vehicles is an autonomous vehicle in a vehicle train.
| 17. The method according to any one of claims 10-16, in which said format for the processed slipperiness information signal is suited to vehicle-to-vehicle transmission.
| 18. The method according to any one of claims 10-17, in which said format for the processed slipperiness information signal is suited to vehicle-to-infrastructure transmission.
| 19. A computer programme (P) for vehicles, which programme (P) comprises programme code for causing a processing device (10; 500) or another computer (500) connected to the processing device (10; 500) to perform steps of the method according to any one of claims 10-18.
| 20. A computer programme product comprising a programme code stored on a computer-readable medium for performing method steps according to any one of claims 10-18 when said programme code is run on a processing device (10; 500) or another computer (500) connected to the processing device (10; 500). | The friction monitoring system has a communication device (14) in a host vehicle (4) to receive a slipperiness information signal (12) and transmit wirelessly a processed slipperiness information signal (15) to other vehicles (16), activating the skid protection system (17) of the other vehicle. The slipperiness information signal is also used to active the skid protection system (22) of the host vehicle in accordance with set of dynamic activation rules. INDEPENDENT CLAIMS are included for the following:a method for a skid protection system for vehicles;a computer program for vehicles; anda computer program product. Friction monitoring system for vehicles, such as autonomous vehicles. Slipperiness information can be disseminated to other vehicles included in a vehicle train, improving traffic safety as well as safety of the vehicle. The drawing shows the block diagram of a friction monitoring system. 4Host vehicle12Slipperiness information signal14Communication device15Processed slipperiness information signal16Other vehicles17Skid protection system of other vehicles22Skid protection system of host vehicle |
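The method claims above pair a friction value with a location, broadcast it vehicle-to-vehicle, and then apply "dynamic activation rules" that take nearby vehicles, in particular the length of a vehicle train, into account before triggering skid protection. One possible reading of such a rule is sketched below; the threshold numbers, the message fields and the speed term are illustrative assumptions and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SlipperinessMessage:
    friction: float   # friction value coordinated with a location (claims 4-5, 13-14)
    lat: float
    lon: float

def should_activate_skid_protection(msg: SlipperinessMessage,
                                    own_speed_mps: float,
                                    train_length: int) -> bool:
    """Dynamic activation rule: demand a larger friction margin for longer
    vehicle trains (claims 3 and 12) and for higher speeds."""
    base_threshold = 0.35                       # assumed nominal friction limit
    train_margin = 0.02 * train_length          # stricter for longer trains
    speed_margin = 0.005 * max(own_speed_mps - 15.0, 0.0)
    return msg.friction < base_threshold + train_margin + speed_margin

msg = SlipperinessMessage(friction=0.30, lat=59.33, lon=18.07)
print(should_activate_skid_protection(msg, own_speed_mps=22.0, train_length=4))  # True
```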
Please summarize the input | Positioning quality filter for the V2X technologies|This provides methods and systems for V2X applications, such as forward collision warning, electronic emergency brake light, left turn assist, work zone warning, signal phase timing, and others, mainly relying on a GNSS positioning solution transmitted via the Dedicated Short-Range Communications (DSRC) to/from the roadside units and onboard units in other V2X-enabled vehicles. However, the positioning solution from a GNSS may be deteriorated by noise and/or bias due to various error sources, e.g., time delay, atmospheric effect, ephemeris effect, and multipath effect. This offers a novel quality filter that can detect noise and the onset of drift in GNSS signals by evaluating up to four metrics that compare the qualities of kinematic variables, speed, heading angle change, curvature, and lateral displacement, obtained directly or derived from GNSS and onboard vehicle sensors. This is used for autonomous cars and vehicle safety, with various examples/variations. The invention claimed is:
| 1. A method for positioning quality filter for a global navigation system for a vehicle, said method comprising:
a central computer receiving global positioning system location data;
said central computer receiving sensors data from vehicle sensors;
said sensors data from said vehicle sensors comprises data from a vehicle speed sensor, a vehicle direction sensor, and a vehicle yaw rate sensor on said vehicle;
said central computer calculating a first metric value based on said sensors data from said vehicle sensors, based on said vehicle speed, said vehicle direction, and said vehicle yaw rate;
a processor receiving a first threshold;
said processor comparing said first metric value with said first threshold;
after determining that said first metric value is larger than or equal to said first threshold, said processor receiving a second threshold;
a) said central computer calculating a second metric value;
b) said processor comparing said second metric value with said second threshold;
c) after determining that said second metric value is smaller than said second threshold, said processor receiving a third threshold;
a. said central computer calculating a third metric value;
b. said processor comparing said third metric value with said third threshold;
c. after determining that said third metric value is larger than or equal to said third threshold, said processor receiving a fourth threshold;
a. said central computer calculating a fourth metric value;
b. said processor comparing said fourth metric value with said fourth threshold;
c. after determining that said fourth metric value is smaller than said fourth threshold, said processor setting said global navigation system value as valid;
said central computer validating said global positioning system location data using said global navigation system value, for safety, operation, or navigation of said vehicle;
said central computer sending a notice to a vehicle warning device;
said central computer correcting a navigation of said vehicle; said central computer adjusting direction of said vehicle.
| 2. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, wherein said first threshold is not greater than 1.
| 3. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning an operator.
| 4. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning a driver.
| 5. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning headquarters.
| 6. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning a central server.
| 7. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning another driver.
| 8. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning another car.
| 9. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with pedestrians.
| 10. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with cloud.
| 11. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with server farms.
| 12. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with police.
| 13. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with a grid, a secured network, or outside car sensors.
| 14. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: resolving conflict between sensors and/or received data. | The positioning method involves use of processor for comparing primary metric value with primary threshold. The processor compares fourth metric value with fourth threshold. The processor sets the global navigation system value as valid in case fourth metric value is smaller than fourth threshold. The processor sets the global navigation system value as invalid when fourth metric value is larger than or equal to fourth threshold. A central computer validates global positioning system location data using global navigation system value, for safety, operation, or navigation of vehicle. An INDEPENDENT CLAIM is also included for a method for positioning quality filter for a positioning system for an automated or autonomous vehicle. Positioning method for quality filter of global navigation system for vehicle e.g. automated or autonomous vehicle. A weighted-averaging process based on the redundancies between coverage of different units, to weighted-average of the data for more accurate results, with more weights for the more reliable units or sources, or higher weights for the results that are closer to the center of curve representing the distribution of values, eliminating or reducing the fringe results or erroneous data. Such estimates and statistics for patterns or behaviors for people are very valuable for marketing and sales people who want to predict and plan ahead. The drawing shows a representation of development of fully automated vehicles, in stages. |
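Claim 1 of this record describes a strictly ordered cascade: the first and third metric values must meet or exceed their thresholds, the second and fourth must stay below theirs, and only when all four comparisons pass is the GNSS value set as valid. A small sketch of that cascade follows; how each metric is computed from speed, heading-angle change, curvature and lateral displacement is left abstract, and the numbers shown are placeholders rather than figures from the patent (claim 2 only bounds the first threshold at no more than 1).

```python
from typing import Sequence

def gnss_fix_is_valid(metrics: Sequence[float], thresholds: Sequence[float]) -> bool:
    """Cascaded positioning quality filter following the comparison directions
    of claim 1: metric 1 >= t1, metric 2 < t2, metric 3 >= t3, metric 4 < t4."""
    m1, m2, m3, m4 = metrics
    t1, t2, t3, t4 = thresholds
    if m1 < t1:          # first check fails, so the fix is not validated
        return False
    if m2 >= t2:
        return False
    if m3 < t3:
        return False
    if m4 >= t4:
        return False
    return True          # all four checks passed, GNSS value set as valid

# Placeholder values only, chosen to pass every stage of the cascade.
print(gnss_fix_is_valid(metrics=(0.9, 0.1, 0.8, 0.05),
                        thresholds=(0.7, 0.3, 0.6, 0.2)))   # True
```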
Please summarize the input | Methods and systems for V2X congestion control using directional antennas, and determining OBU transmission power based on the weather data received from vehicle CAN|Self-driving and autonomous vehicles are very popular these days for scientific, technological, social, and economic reasons. In one aspect of this technology, one of the main concerns for an implementation of any V2X technology on a large scale is the issue of congestion control. In large cities and crowded highways during rush hours, each host vehicle can get messages from over 200 other vehicles and several roadside units, all working on the same channel and trying to send and receive messages at the same time. With respect to the weather effect on signal, the signal path loss occurs whenever there is moderate (or moderate plus) rain, and because of that, the OBU communication packets are prone to get lost, or the communication coverage region is diminished, depending upon the intensity, speed, angle and temperature of the rainfall/snowfall droplets. We have provided solutions for these two problems, with variations. The invention claimed is:
| 1. A method for an autonomous or automated vehicle operation, said method comprising:
a central computer receiving a vehicle state for said vehicle;
a communication channel for said vehicle transmitting in an omni-directional radiation pattern;
said communication channel for said vehicle receiving messages from all directions;
in case said vehicle state indicating a congestion on said communication channel for said vehicle, said communication channel for said vehicle changing to transmission in a specific directional pattern, and said communication channel for said vehicle continuing receiving messages from all directions.
| 2. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
in case channel congestion is detected, said communication channel for said vehicle changing to transmission in a directional pattern toward front and back of said vehicle, and said communication channel for said vehicle continuing receiving messages from all directions.
| 3. The method for an autonomous or automated vehicle operation, as recited in claim 2, said method comprises:
said communication channel for said vehicle switching between transmitting in an omni-directional radiation pattern and transmitting in a directional pattern toward said front and back of said vehicle, based on channel congestion.
| 4. The method for an autonomous or automated vehicle operation, as recited in claim 2, said method comprises:
said communication channel for said vehicle switching between transmitting in an omni-directional radiation pattern and transmitting in a directional pattern toward said front and back of said vehicle, based on driving speed.
| 5. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
monitoring said state for said vehicle, or
monitoring said congestion on said communication channel for said vehicle.
| 6. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
said communication channel for said vehicle switching to transmitting in an omni-directional radiation pattern.
| 7. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
monitoring a threshold value for said congestion on said communication channel for said vehicle.
| 8. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
monitoring a threshold value based on number of messages, or rate of messages for said congestion on said communication channel for said vehicle, or
monitoring a threshold value based on number of cars for said congestion on said communication channel for said vehicle.
| 9. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
monitoring a threshold value based on bandwidth capacity of said communication channel for said congestion on said communication channel for said vehicle.
| 10. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using multiple of directional transmission schemes or patterns.
| 11. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using map of a road for switching on transmission methods.
| 12. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using map and elevation data of a road for optimization of switching on transmission methods.
| 13. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using a combination of directional transmission and non-directional transmission simultaneously.
| 14. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using 2 of directional transmission schemes or patterns.
| 15. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using antenna arrays, a group of antennas, Cassegrain antenna, or parabolic antenna.
| 16. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using map and intersections data as potential danger points for a road for optimization of switching on transmission methods.
| 17. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
using a transition mode of transmission between and for optimization of switching on transmission methods.
| 18. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
independently optimizing transmission and listening modes or methods.
| 19. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
combining optimization of transmission and listening modes or methods.
| 20. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises:
optimizing of transmission based on multiple thresholds, conditions, or triggers. | The method involves comprising a central computer receiving a vehicle state for the vehicle. A communication channel is for the vehicle transmitting in an omni-directional radiation pattern. The communication channel for the vehicle receives messages from all directions. The communication channel for the vehicle changing to transmission in a specific directional pattern, and the communication channel for the vehicle continues receiving messages from all directions in case the vehicle state indicates congestion on the communication channel for the vehicle. Method for operation of autonomous or automated vehicle such as car, sedan, truck, bus, pickup truck, sport utility vehicle (SUV), tractor, agricultural machinery, entertainment vehicles, motorcycle, bike, bicycle, and hybrid. The overall number of messages each target gets is reduced and the channel congestion is avoided. The drawing shows a block diagram for a system with monitoring congestion with multiple modes of transmission. |
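The claims above have the vehicle transmit omni-directionally by default, switch to a directional pattern toward the front and back of the vehicle when channel congestion is detected (measured by message rate, neighbor count, or channel bandwidth thresholds), and keep receiving from all directions throughout. The sketch below illustrates one such switching rule with hypothetical thresholds; the hysteresis between an upper and a lower threshold is an added assumption to avoid rapid toggling, not something claimed.

```python
from enum import Enum, auto

class TxPattern(Enum):
    OMNI = auto()
    FRONT_BACK = auto()        # directional toward front and back of the vehicle (claim 2)

def choose_tx_pattern(current, msgs_per_sec, neighbor_count,
                      high=(800, 150), low=(500, 90)):
    """Switch to the directional pattern when the message rate or neighbor
    count crosses its upper threshold, and switch back only once both fall
    below the lower thresholds. Reception stays omni-directional regardless."""
    if current is TxPattern.OMNI:
        if msgs_per_sec > high[0] or neighbor_count > high[1]:
            return TxPattern.FRONT_BACK
    else:
        if msgs_per_sec < low[0] and neighbor_count < low[1]:
            return TxPattern.OMNI
    return current

pattern = TxPattern.OMNI
pattern = choose_tx_pattern(pattern, msgs_per_sec=950, neighbor_count=120)
print(pattern)   # TxPattern.FRONT_BACK
```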
Please summarize the input | Automatic driving method and system|An embodiment of the invention provides an automatic driving system comprising: an image processing unit that obtains image data of a peripheral vehicle of the automatic driving vehicle and processes the image data to obtain first driving state information of the peripheral vehicle; a V2X communication unit that communicates with the peripheral vehicle within the V2X communication distance to obtain second driving state information of the peripheral vehicle; and a decision unit that determines the driving action of the automatic driving vehicle according to the first driving state information, or according to the first driving state information and the second driving state information. The invention addresses the problem in the related art that traffic safety suffers when a surrounding vehicle is misjudged because of special conditions of the automatic driving vehicle, and thereby improves automatic driving safety.|1. An automatic driving system, provided on an automatic driving vehicle, comprising: an image processing unit for obtaining image data of a peripheral vehicle of the automatic driving vehicle and processing the image data to obtain first driving state information of the peripheral vehicle; a vehicle network V2X communication unit for obtaining second driving state information of the peripheral vehicle by communicating with the peripheral vehicle within a V2X communication distance; and a decision unit for determining a driving action of the automatic driving vehicle according to the first driving state information of the peripheral vehicle, or the first driving state information and the second driving state information.
| 2. The system according to claim 1, wherein the first driving state information of the peripheral vehicle comprises at least one of the following: vehicle profile information; vehicle orientation information.
| 3. The system according to claim 1, wherein the second driving state information of the peripheral vehicle comprises at least one of: vehicle identification information; vehicle latitude and longitude information; vehicle driving speed information; vehicle driving direction information.
| 4. The system according to claim 1, wherein the driving action of the automatic driving vehicle comprises at least one of the following: driving at a uniform speed; accelerating; decelerating; changing lanes.
| 5. The system according to claim 1, wherein the decision unit further comprises: a distance outer decision sub-unit for calculating the driving speed and driving direction of the peripheral vehicle according to the first driving state information outside the V2X communication distance, and determining the driving action of the automatic driving vehicle according to the driving speed and driving direction of the peripheral vehicle.
| 6. The system according to claim 1, wherein the decision unit further comprises: a distance inner decision sub-unit for, within the V2X communication distance, judging whether the first driving state information and the second driving state information describe the same peripheral vehicle; if so, determining the driving speed and driving direction of the peripheral vehicle according to the first driving state information and the second driving state information; if not, determining the driving speed and driving direction of the peripheral vehicle according to the second driving state information; and determining the driving action of the automatic driving vehicle according to the driving speed and the driving direction of the peripheral vehicle.
| 7. An automatic driving method for an automatic driving vehicle, comprising: obtaining image data of a peripheral vehicle of the automatic driving vehicle, and processing the image data to obtain first driving state information of the peripheral vehicle; obtaining second driving state information of the peripheral vehicle by communicating with the peripheral vehicle within a V2X communication distance; and determining a driving action of the automatic driving vehicle according to the first driving state information of the peripheral vehicle, or the first driving state information and the second driving state information.
| 8. The method according to claim 7, wherein the step of determining the driving action of the automatic driving vehicle according to the first driving state information comprises: outside the V2X communication distance, calculating the driving speed and driving direction of the peripheral vehicle according to the first driving state information; and determining the driving action of the automatic driving vehicle according to the driving speed and the driving direction of the peripheral vehicle.
| 9. The method according to claim 7, wherein the step of determining the driving action of the automatic driving vehicle according to the first driving state information and the second driving state information comprises: within the V2X communication distance, judging whether the first driving state information and the second driving state information describe the same peripheral vehicle; if so, determining the driving speed and driving direction of the peripheral vehicle according to the first driving state information and the second driving state information; if not, determining the driving speed and driving direction of the peripheral vehicle according to the second driving state information; and determining the driving action of the automatic driving vehicle according to the driving speed and the driving direction of the peripheral vehicle.
| 10. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of the claims.
| 11. An electronic device, comprising a memory, a processor and a computer program stored on the memory and capable of being operated on the processor, wherein the method according to any one of claims is realized when the computer program is executed by the processor. | The system (20) has an image processing unit (210) that is provided for obtaining the image data of the peripheral vehicle of the automatic driving vehicle and processes the image data to obtain the first driving state information of the peripheral vehicle. A vehicle network vehicle to vehicle (V2V) communication unit (220) is provided for obtaining the second driving state information of the peripheral vehicle by communicating with the peripheral vehicle in the communication distance. A decision unit (230) is provided for determining the driving action of the automatic driving vehicle according to the first driving state information of the peripheral vehicle or the first driving state information and the second driving state information. INDEPENDENT CLAIMS are included for the following:an automatic driving method applied to an automatic driving vehicle;a computer readable storage medium storing program for providing the automatic driving of vehicle; andan electronic device. Automatic driving system for automatic driving vehicle. The system solves the defect that the peripheral vehicle state information in the existing technology is not accurate, and the different system modules are used at different vehicle distances to improve the precision of the obtained peripheral vehicle state information, so as to make the automatic driving vehicle to perform better reaction action, so as to achieve safer driving target. The drawing shows a block diagram of the automatic driving system. (Drawing includes non-English language text) 20Automatic driving system210Image processing unit220Vehicle to vehicle communication unit230Decision unit |
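Claims 5-6 and 8-9 of this record split the decision logic by range: outside the V2X communication distance the peripheral vehicle's speed and heading are derived from image data alone, while inside the range the system first checks whether the image-based and V2X-based states describe the same vehicle, fuses them if so, and otherwise relies on the V2X report. The sketch below illustrates that branching; the equal-weight fusion, the data-class fields, and the naive heading average are assumptions, not details from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackState:
    speed_mps: float
    heading_deg: float

def estimate_peripheral_state(camera: TrackState,
                              v2x: Optional[TrackState],
                              same_vehicle: bool) -> TrackState:
    """Outside V2X range (v2x is None): use the camera-derived state.
    Inside range: blend the two when they describe the same vehicle,
    otherwise trust the vehicle's own V2X report."""
    if v2x is None:
        return camera
    if same_vehicle:
        # Naive averaging; a real system would handle heading wrap-around.
        return TrackState(speed_mps=0.5 * (camera.speed_mps + v2x.speed_mps),
                          heading_deg=0.5 * (camera.heading_deg + v2x.heading_deg))
    return v2x

print(estimate_peripheral_state(TrackState(13.0, 90.0), TrackState(14.0, 92.0), True))
```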
Please summarize the input | WAFER TRANSFERRING AUTOMATION SYSTEM AND OPERATING METHOD THEREOF|The wafer transport automation system according to the present invention may include autonomous vehicles that load at least one airtight container including a plurality of Front Opening Unified Pods (FOUPs) and perform vehicle-to-vehicle communication; an autonomous robot that transfers or loads the at least one airtight container and moves the loaded airtight container so as to dock closely with a partition wall dividing a clean room and a non-clean room; an interface device that transfers each of the plurality of FOUPs between the at least one airtight container loaded on the autonomous robot and a conveyor in the clean room; and a transfer automation server that performs wireless communication with the autonomous vehicles, the autonomous robot, and the interface device to perform wafer transport automation.|1. Autonomous vehicles that load at least one airtight container including a plurality of Front Opening Unified Pods (FOUPs) and perform vehicle-to-vehicle communication;
an autonomous robot that transfers or loads the at least one airtight container from each of the autonomous vehicles and moves the loaded airtight container so that the airtight container is closely docked with a bulkhead dividing a clean room and a non-clean room;
an interface device for transferring each of a plurality of FOUPs between the at least one airtight container loaded in the autonomous robot and the conveyor of the clean room; and a transfer automation server that performs wireless communication with the self-driving vehicles, the self-driving robot, and the interface device to perform wafer transfer automation.
| 2. The wafer transport automation system according to claim 1, wherein the at least one airtight container includes a sealing unit that isolates the interior of the container from the exterior.
| 3. The wafer transport automation system according to claim 1, wherein the at least one airtight container includes at least one damping unit providing a low-vibration or vibration-free function.
| 4. The wafer transport automation system according to claim 1, wherein the at least one airtight container takes in and discharges nitrogen gas to maintain constant temperature, constant humidity, and low moisture.
| 5. The wafer transport automation system according to claim 1, wherein the at least one airtight container includes a Radio Frequency Identification (RFID) tag for computerized lot management.
| 6. The wafer transport automation system according to claim 1, wherein each of the autonomous vehicles performs a function of automatically opening and closing a loading box door for loading and unloading the airtight container, a function of maintaining a constant temperature and humidity inside the vehicle body, or a function of maintaining low or no vibration inside the vehicle body while driving.
| 7. The wafer transport automation system according to claim 1, wherein each of the autonomous vehicles performs parallel input output (PIO) communication with an autonomous robot or wireless communication with the transport automation server.
| 8. The wafer transport automation system according to claim 1, wherein the autonomous robot supplies nitrogen gas to the at least one airtight container, performs Parallel Input Output (PIO) communication with each of the autonomous vehicles, or performs wireless communication with the transfer automation server.
| 9. The wafer transport automation system according to claim 1, wherein the interface device opens and closes a door of the bulkhead, opens and closes a door of the at least one airtight container, performs Parallel Input Output (PIO) communication with the autonomous robot, or performs wireless communication with the transfer automation server.
| 10. A method of operating a wafer transport automation system, comprising: moving an airtight container including a plurality of Front Opening Unified Pods (FOUPs) from a first factory using an autonomous vehicle;
transferring the airtight container from the autonomous vehicle to the autonomous robot;
closely docking the airtight container to a partition wall between a clean room and a non-clean room of a second factory;
opening the airtight container by an interface device and dispensing each of the plurality of FOUPs to a conveyor of the clean room; and transferring the ejected FOUP to the conveyor. | The system has an airtight container (130) including multiple front opening unified pods (FOUPs) for performing vehicle-to-vehicle communication. An autonomous robot (200) transfers or loads the container from each of autonomous vehicles (100) and moves the loaded container such that the container is closely docked with a bulkhead dividing a clean room and a non-clean room. An interface device (300) transfers each of the FOUPs between the container loaded in the autonomous robot and a conveyor of the clean room. A transfer automation server (400) performs wireless communication with the autonomous vehicles. An INDEPENDENT CLAIM is also included for a method for operating an automation wafer transfer system for use during semiconductor manufacturing process. Automation wafer transfer system for use during semiconductor manufacturing process. The system achieves complete automation of wafer transport by transferring an airtight container having wafer between factories through an autonomous vehicle between factories, reduces the transport waiting time by eliminating manual work outside logistics, and minimizes the safety hazards to workers. The drawing shows a schematic view of an automation wafer transfer system for use during semiconductor manufacturing process.(Drawing includes non-English language text).100Autonomous vehicle130Airtight container200Autonomous robot300Interface device400Transfer automation server |
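Claim 10 of this record is essentially a fixed sequence of hand-offs: vehicle to robot, robot docking at the bulkhead, the interface device opening doors and dispensing each FOUP, and the conveyor carrying the FOUPs onward. The sketch below encodes that sequence with plain callables standing in for the claimed units; every name here is a hypothetical stand-in rather than an API from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Foup:
    lot_id: str

@dataclass
class AirtightContainer:
    foups: list = field(default_factory=list)

def transfer_foups(container, dock, open_door, open_container, dispense):
    """Run the claim-10 sequence after the container has already been moved
    from the first factory and handed from the vehicle to the robot."""
    dock()                      # robot docks the container tightly to the bulkhead
    open_door()                 # interface device opens the bulkhead door
    open_container(container)   # interface device opens the airtight container
    for foup in container.foups:
        dispense(foup)          # FOUP handed over to the clean-room conveyor

container = AirtightContainer(foups=[Foup("LOT-001"), Foup("LOT-002")])
transfer_foups(container,
               dock=lambda: print("docked to bulkhead"),
               open_door=lambda: print("bulkhead door open"),
               open_container=lambda c: print("container open"),
               dispense=lambda f: print("dispensed", f.lot_id))
```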
Please summarize the input | Electronic device for supporting wireless mobile communication for vehicle and operation method of the same|Provided are an electronic device for supporting vehicle-to-everything (V2X) communication on which an autonomous driving vehicle technology, a cooperative-intelligent transport systems (C-ITS) technology, etc. are based and an operation method of the electronic device. The electronic device mounted to a vehicle to support wireless mobile communication for the vehicle includes: a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology; a cellular V2X (C-V2X) module configured to perform wireless communication by using C-V2X technology; an antenna; and a processor configured to control a switch to connect one of the DSRC module and the C-V2X module to the antenna. What is claimed is:
| 1. An electronic device mounted to a vehicle to support wireless mobile communication for the vehicle, the electronic device comprising:
a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology;
a cellular vehicle-to-everything (C-V2X) module configured to perform wireless communication by using C-V2X technology;
an antenna;
a switch; and
a processor configured to control the switch to connect the DSRC module or the C-V2X module to the antenna,
wherein the antenna includes a pair of sub-antennas for diversity transmission and diversity reception, and
wherein the switch is configured to receive two output signals from the DSRC module for the pair of sub-antennas or two output signals from the C-V2X module for the pair of sub-antennas and output two signals to the pair of sub-antennas.
| 2. The electronic device of claim 1, further comprising a telematics control unit (TCU).
| 3. The electronic device of claim 1, wherein each of the DSRC module and the C-V2X module includes a V2X modem and a radio frequency (RF) transceiver.
| 4. The electronic device of claim 1, wherein the processor is further configured to:
select a module from the DSRC module and the C-V2X module based on location information of the vehicle; and
control the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna.
| 5. The electronic device of claim 4, wherein the location information of the vehicle includes a global positioning system (GPS) signal for the vehicle,
wherein the module selected from the DSRC module and the C-V2X module is selected based on information about a V2X communication technology corresponding to a location of the vehicle and the location information of the vehicle, and
wherein the module selected from the DSRC module and the C-V2X module supports a V2X communication technology corresponding to a current location of the vehicle.
| 6. The electronic device of claim 1, wherein the processor is further configured to:
select a module from the DSRC module and the C-V2X module based on information about a base station that is performing cellular communication with the electronic device; and
control the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna.
| 7. The electronic device of claim 6, wherein the module selected from the DSRC module and the C-V2X module is selected based on information about a V2X communication technology corresponding to the base station and the information about the base station, and
wherein the module selected from the DSRC module and the C-V2X module supports the V2X communication technology corresponding to the base station.
| 8. The electronic device of claim 1, wherein the processor is further configured to:
select a module from the DSRC module and the C-V2X module by periodically comparing a DSRC signal received via the DSRC module with a C-V2X signal received via the C-V2X module; and
control the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna.
| 9. The electronic device of claim 8, wherein the module selected from the DSRC module and the C-V2X module is selected based on comparing packet error rate (PER), packet reception rate (PRR), latency, and/or strength of the DSRC signal and the C-V2X signal to each other.
| 10. The electronic device of claim 9, wherein the processor is further configured to:
control the switch to perform diversity communication by using one of the DSRC module and the C-V2X module and the pair of sub-antennas included in the antenna;
control the switch to receive the DSRC signal via a first sub-antenna of the pair of sub-antennas and the C-V2X signal via a second sub-antenna at preset time periods; and
determine whether to change V2X communication technology based on a result of the comparing the DSRC signal with the C-V2X signal.
| 11. The electronic device of claim 1, wherein the processor is further configured to:
obtain surrounding environment information and select a module from the DSRC module and the C-V2X module based on the obtained surrounding environment information; and
control the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna.
| 12. The electronic device of claim 1, wherein the processor is further configured to:
obtain a captured image of the vehicle's surrounding environment as surrounding environment information; and
identify, in the captured image, an entity supporting vehicle-to-infrastructure (V2I) communication with the vehicle and select a module from the DSRC module and the C-V2X module as a module supporting a V2X communication technology corresponding to the identified entity.
| 13. An operation method of an electronic device mounted to a vehicle to support wireless mobile communication for the vehicle, the operation method comprising:
selecting a module from a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology and a cellular vehicle-to-everything (C-V2X) module configured to perform wireless communication by using C-V2X technology;
controlling a switch to connect the selected module to an antenna; and
performing V2X communication via the selected module,
wherein the antenna includes a pair of sub-antennas for diversity transmission and diversity reception, and
wherein the method further comprises:
receiving two output signals from the DSRC module for the pair of sub-antennas or two output signals from the C-V2X module for the pair of sub-antennas; and
outputting two signals to the pair of sub-antennas.
| 14. The operation method of claim 13, wherein location information of the vehicle includes a global positioning system (GPS) signal for the vehicle,
wherein the selecting of the module further comprises:
obtaining the location information of the vehicle; and
selecting the module from the DSRC module and the C-V2X module based on information about a V2X communication technology corresponding to a location of the vehicle and the location information of the vehicle, and
wherein the module selected from the DSRC module and the C-V2X module supports a V2X communication technology corresponding to a current location of the vehicle.
| 15. The operation method of claim 13, wherein the selecting of the module further comprises:
obtaining information about a base station that is performing cellular communication with the electronic device; and
selecting the module from the DSRC module and the C-V2X module based on information about a V2X communication technology corresponding to the base station and the information about the base station,
wherein the module selected from the DSRC module and the C-V2X module supports the V2X communication technology corresponding to the base station.
| 16. The operation method of claim 13, wherein the module selected from the DSRC module and the C-V2X module is selected based on periodically comparing a DSRC signal received via the DSRC module with a C-V2X signal received via the C-V2X module.
| 17. The operation method of claim 13, wherein the selecting of the module further comprises:
obtaining surrounding environment information of the vehicle; and
selecting the module from the DSRC module and the C-V2X module based on the obtained surrounding environment information.
| 18. A non-transitory computer-readable recording medium having stored therein a program for performing an operation method of an electronic device mounted to a vehicle to support wireless mobile communication for the vehicle, the operation method comprising:
selecting a module from a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology and a cellular vehicle-to-everything (C-V2X) module configured to perform wireless communication by using C-V2X technology;
controlling a switch to connect the selected module to an antenna; and
performing V2X communication via the selected module,
wherein the antenna includes a pair of sub-antennas for diversity transmission and diversity reception, and
wherein the operation method further comprises:
receiving two output signals from the DSRC module for the pair of sub-antennas or two output signals from the C-V2X module for the pair of sub-antennas; and
outputting two signals to the pair of sub-antennas. | The electronic device comprises a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology. A cellular vehicle-to-everything (C-V2X) module is configured to perform wireless communication by using C-V2X technology. A processor is provided to control a switch to connect the DSRC module or the C-V2X module to the antenna. The DSRC module and the C-V2X module include a V2X modem and a radio frequency (RF) transceiver. The antenna includes a pair of sub-antennas for diversity transmission and diversity reception. A module is selected from the DSRC module and the C-V2X module based on location information of the vehicle (30,100). INDEPENDENT CLAIMS are included for the following:a method for an electronic device mounted for supporting wireless mobile communication for vehicle through wired or wireless networks; anda computer-readable recording medium having stored instructions for implementing the method for supporting wireless mobile communication for vehicle through wired or wireless networks. Electronic device for supporting wireless mobile communication for vehicle, such as car through wired or wireless networks, such as third generation , fourth generation and fifth generation networks. Electronic device can select the V2X module, which achieves better performance by comparing packet error rate (PER), packet reception rate (PRR), latency, or the strength of the DSRC signal and the C-V2X signal. Multiple weight values can be modified to reduce or minimize a loss or cost value obtained by the AI model during a training process. The drawing shows a perspective view of electronic device for supporting wireless mobile communication for vehicle. 10Network20Infrastructure30,100Vehicle40Pedestrian |
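Claims 8-10 of this record have the processor periodically compare the DSRC and C-V2X links on packet error rate, packet reception rate, latency and/or signal strength, then drive the switch toward the better-performing module. The scoring function below is one hedged way to combine those criteria; the weights and the example numbers are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class LinkStats:
    per: float          # packet error rate, 0..1 (lower is better)
    latency_ms: float   # lower is better
    rssi_dbm: float     # higher is better

def link_score(s: LinkStats) -> float:
    """Combine claim 9's criteria into a single score; weights are assumed."""
    return (1.0 - s.per) * 50.0 - 0.2 * s.latency_ms + 0.5 * (s.rssi_dbm + 100.0)

def select_radio(dsrc: LinkStats, cv2x: LinkStats) -> str:
    """Connect the antenna switch to whichever module scores higher."""
    return "DSRC" if link_score(dsrc) >= link_score(cv2x) else "C-V2X"

print(select_radio(LinkStats(per=0.02, latency_ms=12.0, rssi_dbm=-70.0),
                   LinkStats(per=0.05, latency_ms=25.0, rssi_dbm=-78.0)))  # DSRC
```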
Please summarize the input | Method and apparatus for operating autonomous driving controller of vehicle|Provided is a method and apparatus for operating an autonomous driving controller, the method including generating route information for the vehicle based on a rule, transitioning from an autonomous driving mode to an autonomous driving disable mode, in response to the driving route information not being generated for an amount of time greater than or equal to a threshold, tracking at least one neighboring vehicle based on data sensed by a sensor, and generating temporary driving route information based on a movement of the at least one neighboring vehicle. What is claimed is:
| 1. A method of driving a vehicle, the method comprising:
generating driving route information for the vehicle based on a rule; and
in response to the driving route information not being generated based on the rule for an amount of time greater than or equal to a threshold:
disabling an autonomous driving mode using the driving route information;
tracking at least one neighboring vehicle based on data sensed by a sensor;
generating temporary driving route information based on a movement of the at least one neighboring vehicle; and
driving the vehicle based on the temporary driving route information.
| 2. The method of claim 1, wherein the generating of the driving route information comprises:
recognizing a surrounding environment of the vehicle based on the data sensed by the sensor; and
generating the driving route information based on the recognized surrounding environment and the rule.
| 3. The method of claim 1, wherein the tracking of the movement of the at least one neighboring vehicle comprises:
periodically determining a location of the at least one neighboring vehicle; and
tracking the movement of the at least one neighboring vehicle based on a change of the location of the at least one neighboring vehicle.
| 4. The method of claim 1, wherein the generating of the temporary driving route information comprises:
determining a difference in movement between a first neighboring vehicle and a second neighboring vehicle, in response to movements of two neighboring vehicles being tracked; and
generating the temporary driving route information based on the difference in movement between the first neighboring vehicle and the second neighboring vehicle.
| 5. The method of claim 4, wherein the determining of the difference comprises determining the difference by comparing a first surrounding environment in which the first neighboring vehicle moves to a second surrounding environment in which the second neighboring vehicle does not move, and
the generating of the temporary driving route information comprises generating the temporary driving route information based on the first surrounding environment including the difference.
| 6. The method of claim 1, wherein the generating of the temporary driving route information comprises generating the temporary driving route information to move the vehicle based on a change of a location of the at least one neighboring vehicle.
| 7. The method of claim 1, wherein the at least one neighboring vehicle comprises a vehicle moving in a direction identical to a direction of the vehicle.
| 8. The method of claim 1, wherein the at least one sensor comprises any one or any combination of a camera, a lidar, and a radar.
| 9. The method of claim 1, wherein the generating of the temporary driving route information comprises generating the temporary driving route information in response to an absence of a movement of a vehicle in a direction different from a direction of the vehicle.
| 10. The method of claim 1, further comprising, in response to the driving route information not being generated based on the rule for the amount of time greater than or equal to the threshold, updating log information stored in a memory.
| 11. The method of claim 10, further comprising:
searching the memory for the log information corresponding to a current circumstance, in response to a route generating mode being transitioned to an autonomous driving disable mode; and
generating the temporary driving route information based on the log information corresponding to the current circumstance.
| 12. The method of claim 11, wherein the current circumstance comprises any one or any combination of a type of an obstacle, a size of the obstacle, weather conditions, type of a road, road conditions, a size of a lane, and a number of lanes, and
the searching comprises searching the memory for the log information having greatest similarity with the current circumstance.
| 13. The method of claim 1, further comprising:
receiving the temporary driving route information through wireless communication or vehicle to vehicle (V2V) communication with the at least one neighboring vehicle, in
response to a route generating mode being transitioned to an autonomous driving disable mode.
| 14. A non-transitory computer-readable medium storing instructions that, when executed by a processor, causes the processor to perform the method of claim 1.
| 15. An autonomous driving controller comprising:
a memory configured to store instructions; and
a processor configured to execute the instructions to generate route information for a vehicle based on a rule, and in response to the driving route information not being generated based on the rule for an amount of time greater than or equal to a threshold, configured to transition from an autonomous driving mode to an autonomous driving disable mode, track at least one neighboring vehicle based on data sensed by a sensor, generate temporary driving route information based on a movement of the at least one neighboring vehicle and control the vehicle based on the temporary driving route information.
| 16. A method of controlling an autonomous driving controller, the method comprising:
generating driving route information for a vehicle based on a rule;
in response to the driving route information not being generated based on the rule for an amount of time greater than or equal to a preset time:
transitioning from an autonomous driving mode to an autonomous driving disable mode;
searching a memory for log information having greatest similarity with a current circumstance, in response to the transitioning to the autonomous driving disable mode;
generating temporary driving route information based on the log information; and
controlling the vehicle based on the temporary driving route information.
| 17. A method of driving a vehicle, the method comprising:
generating route information for the vehicle being driven in an autonomous driving mode; and
in response to the route information violating a rule for a time period greater than a threshold:
disabling the autonomous driving mode;
tracking a change of a location of at least one neighboring vehicle based on data sensed by a sensor;
generating temporary driving route information based on the change of the location of the at least one neighboring vehicle; and
driving the vehicle based on the temporary driving route information.
| 18. The method of claim 17, wherein the generating of the temporary driving route information comprises generating a temporary driving route based on the change of the location of the at least one neighboring vehicle and a surrounding environment in which the at least one neighboring vehicle moves.
| 19. The method of claim 17, further comprising:
updating log data, in a memory, in response to the generating of the temporary driving route information.
| 20. The method of claim 19, wherein the generating of the temporary driving route information comprises generating a temporary driving route based on the change of the location of the at least one neighboring vehicle and a log entry, stored in the memory, corresponding to a surrounding environment of the vehicle. | The method involves generating (710) the route information for the vehicle based on a rule. A neighboring vehicle is tracked based on the data sensed by a sensor. The temporary driving route information is generated (760) based on a movement of the neighboring vehicle. A surrounding environment of the vehicle is recognized based on the data sensed by the sensor. The sensor is a combination of a camera, light detection and ranging (lidar), and a radar. A memory is searched for the log information corresponding to a current circumstance. INDEPENDENT CLAIMS are included for the following: a non-transitory computer-readable medium for storing instructions; and an autonomous driving controller having a memory. Method for driving a vehicle. The method involves generating the route information for the vehicle based on a rule, where a neighboring vehicle is tracked based on the data sensed by a sensor and the temporary driving route information is generated based on a movement of the neighboring vehicle, and thus enables the vehicle to recognize surrounding objects and to generate a driving route that meets traffic regulations and avoids contact with the surrounding objects. The drawing shows a flowchart of a method for operating an autonomous driving controller: 710 Generating the route information for the vehicle; 720 Verifying whether the driving route information is generated based on the rule; 730 Transitioning a route generating mode from an autonomous driving mode to an autonomous driving disable mode; 740 Tracking the movement of a neighboring vehicle based on the data sensed by the sensor; 760 Generating the temporary driving route information. |
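The flowchart steps 710 to 760 described above can be read as one pass of a fallback loop. Purely as an illustrative sketch (every function name below is a placeholder, not an API from the patent), the same ordering looks like this:

```python
# Hypothetical sketch of the flow in the drawing, with comments mapping to
# the reference numerals 710-760. All functions here are placeholders.

def generate_route_by_rule(sensor_data):  # 710: generate route information based on the rule
    return None  # pretend the rule fails for this pass

def rule_route_available(route, elapsed_s, threshold_s=3.0):  # 720: verify generation succeeded in time
    return route is not None or elapsed_s < threshold_s

def track_neighboring_vehicle(sensor_data):  # 740: track a neighboring vehicle from sensed data
    return sensor_data.get("neighbor_positions", [])

def run_one_pass(sensor_data, elapsed_s):
    route = generate_route_by_rule(sensor_data)                   # 710
    if rule_route_available(route, elapsed_s):                    # 720
        return route, "autonomous_driving"
    mode = "autonomous_driving_disabled"                          # 730: transition the route generating mode
    neighbor_positions = track_neighboring_vehicle(sensor_data)   # 740
    temporary_route = list(neighbor_positions)                    # 760: generate the temporary route
    return temporary_route, mode

print(run_one_pass({"neighbor_positions": [(0.0, 0.0), (4.0, 0.1)]}, elapsed_s=5.0))
```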