Positioning An Unmanned Robotic Vehicle

FIELD OF THE INVENTION

The present invention relates generally to unmanned robotic vehicles, and, in particular, to positioning an unmanned robotic vehicle in proximity to an object.

BACKGROUND OF THE INVENTION

There is no such thing as a routine traffic stop: public safety officers continue to be hurt or killed while conducting them. For example, the Federal Bureau of Investigation's (FBI) Law Enforcement Officers Killed and Assaulted (LEOKA) statistics estimate that between 2001 and 2010, 95 public safety officers died during traffic stops and 4,752 were assaulted.

A traffic stop is always an at-risk situation because one never knows who is in the vehicle and what the intentions are of the vehicle's occupants. Further, one also does not know what objects, such as guns, may be inside the vehicle and how any such objects can be used against a public safety officer. Additionally, in high-risk traffic stops, such as may result from pursuit of a vehicle believed to have been involved in a felony crime, approaching the vehicle may pose an extreme risk to the public safety officer. With advanced artificial intelligence, machine learning, and robotics, some of these risks may be mitigated by allowing some actions to be undertaken by an unmanned robotic vehicle.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.

FIG. 1 is a block diagram of a wireless communication system in accordance with some embodiments of the present invention.

FIG. 2 is a block diagram of an unmanned robotic vehicle of the communication system of FIG. 1 in accordance with some embodiments of the present invention.

FIG. 3 is a block diagram of a server of the communication system of FIG. 1 in accordance with some embodiments of the present invention.

FIG. 4A is a logic flow diagram illustrating a method executed by the communication system of FIG. 1 in positioning an unmanned robotic vehicle in accordance with some embodiments of the present invention.

FIG. 4B is a continuation of the logic flow diagram of FIG. 4A illustrating a method executed by the communication system of FIG. 1 in positioning an unmanned robotic vehicle in accordance with some embodiments of the present invention.

One of ordinary skill in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various embodiments of the present invention. Also, common and well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present invention. It will further be appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required.

DETAILED DESCRIPTION OF THE INVENTION

A method and apparatus are provided for positioning an unmanned robotic vehicle (URV). The URV captures a set of one or more of image and non-image information, that is, a set of image and/or non-image information, associated with an object of interest while positioned at a first position; provides the set of image and/or non-image information to a server entity; in response to providing the set of image and/or non-image information, receives a three-dimensional (3D) model associated with the object; autonomously determines a second position based on the 3D model; and autonomously navigates to the second position. At the second position, the URV may capture further image and/or non-image information and, based on the further captured image and/or non-image information, autonomously determine, and navigate to, a third position. The steps of capturing further image and/or non-image information and, based on the captured image and/or non-image information, autonomously determining, and navigating to, further positions may be repeated indefinitely, or until otherwise instructed. Thus, the information collected by the URV and information returned to the URV by the server entity may allow a public safety responder to better assess a situation with respect to the object and determine follow-up actions while minimizing personal risk to the public safety responder, such as calling for backup without having to first approach the vehicle on foot.

Generally, an embodiment of the present invention encompasses a method for positioning a URV. The method includes capturing, by the URV, a set of one or more of image and non-image information associated with an object while positioned at a first position; providing, by the URV, the set of one or more of image and non-image information to a server entity; in response to providing the set of one or more of image and non-image information, receiving, by the URV, a three-dimensional model associated with the object; autonomously determining, by the URV, a second position based on the three-dimensional model; and autonomously navigating the URV to the second position.
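The capture-query-navigate loop described above can be sketched in code. The sketch below is a minimal, hypothetical illustration: the `capture`, `query_server`, and `navigate` callables stand in for the URV's sensors, the server entity, and the propulsion subsystem, and the standoff heuristic in `next_vantage_point` is an assumption, not part of the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class Position:
    x: float
    y: float
    z: float

def next_vantage_point(model_extent: float, current: Position) -> Position:
    """Choose a standoff position scaled to the object's size.

    Hypothetical heuristic: hold a distance of twice the model's largest
    dimension, approaching along the x-axis toward the object at the origin.
    """
    standoff = 2.0 * model_extent
    return Position(x=-standoff, y=0.0, z=current.z)

def position_urv(capture, query_server, navigate, start: Position, steps: int = 3) -> Position:
    """Iteratively reposition the URV: capture information at the current
    position, query the server entity for a 3D model, derive the next
    position from the model, and autonomously navigate to it."""
    pos = start
    for _ in range(steps):
        info = capture(pos)                # image and/or non-image information
        model_extent = query_server(info)  # server returns a 3D model; only its extent is used here
        pos = next_vantage_point(model_extent, pos)
        navigate(pos)                      # autonomous navigation to the new position
    return pos
```

In a real system each callable would be backed by the image capture device, the wireless interfaces, and the motion planning module; the loop structure is what the method claims describe.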

Another embodiment of the present invention encompasses a URV that includes a processor, a propulsion system, one or more wireless interfaces, one or more sensors, an image capture device, and at least one memory device. The at least one memory device is configured to store a set of instructions that, when executed by the processor, cause the processor to perform the following functions: capture, via one or more of the image capture device and the one or more sensors, a set of one or more of image and non-image information associated with an object while positioned at a first position; provide the set of one or more of image and non-image information to a server entity; in response to providing the set of one or more of image and non-image information, receive a three-dimensional model associated with the object; autonomously determine a second position based on the three-dimensional model; and autonomously navigate, via the propulsion system, the URV to the second position.

The present invention may be more fully described with reference to FIGS. 1-4B. FIG. 1 is a block diagram of a wireless communication system 100 in accordance with some embodiments of the present invention. Communication system 100 includes an incident scene 110 comprising an object or vehicle of interest 102. Communication system 100 further includes a mobile unmanned robotic vehicle (URV) 104, such as an unmanned ground vehicle (UGV) or an unmanned aerial vehicle (UAV) (depicted in FIG. 1 as a UAV), for example, a drone or a flying remote-controlled robot, that is equipped with various sensors and data capture devices for acquiring data and images with respect to object/vehicle 102. Communication system 100 further includes one or more public safety responders 106, 108, such as a public safety responder 106 on foot or a public safety responder 108 located inside a public safety response vehicle 109, for example, a police car, a fire truck, a hazardous materials response vehicle, or a command van, which public safety responders 106, 108 and URV 104 are located at or near incident scene 110 and are in wireless communication with the URV.

Communication system 100 further includes a wireless communication network 120 that provides wireless services to communication devices within a coverage area of the wireless communication network, such as URV 104 and mobile devices of public safety responders 106, 108. Communication system 100 further includes a public safety agency 140, such as a police department or a fire department, that is in communication with wireless communication network 120 via a data network 130, such as the Internet and/or an enterprise or public safety agency network, and, via the wireless communication network and data network, with URV 104 and public safety responders 106 and 108. In some embodiments of the present invention, the public safety response vehicle 109 of public safety responder 108 may include a digital vehicular repeater system (DVRS) capable of relaying communications between wireless communications network 120 and each of URV 104 and public safety responders 106 and 108 and/or may include ultrasonic and/or ultra-wideband transmitter/receiver circuitry capable of engaging in wireless communications with ultrasonic and/or ultra-wideband transmitter/receiver circuitry of URV 104. Collectively, wireless communications network 120, data network 130, and public safety agency 140 are an infrastructure of communication system 100 and elements of the wireless communications network, data network, and public safety agency may be referred to as infrastructure elements of communication system 100.

Wireless communication network 120 includes a radio access network (RAN) comprising one or more wireless access nodes (not shown), such as an access point, a base station, and an eNodeB, which RAN is in communication with a core network (not shown). Wireless communication network 120 may operate in accordance with any wireless communication technology that supports data applications. For example, wireless communication network 120 may be a public safety (PS) network that can utilize, for example, Long Term Evolution (LTE), Enhanced Voice-Data Optimized (EVDO), IEEE 802.11 and variants thereof ("Wi-Fi"), Project 25 (P25), Digital Mobile Radio (DMR), Land Mobile Radio (LMR), Terrestrial Trunked Radio (TETRA), etc.

Public safety agency 140 includes an infrastructure-based server entity, or server, 142 that implements an image processing system and one or more databases 144-147 (four shown) that are accessible by the server and are repositories of public safety content; however, in other embodiments of the present invention, server 142 and the one or more public safety content databases 144-147 may be located at public safety response vehicle 109 and may be accessible via the DVRS of the public safety response vehicle. The functions described herein as performed by server 142 and the one or more public safety content databases 144-147 are not specific to where the server and databases are located. That is, instead of or in addition to being infrastructure-based, the server and databases may be vehicle-based (located in public safety response vehicle 109) or URV-based (located in URV 104), or the functions described herein as performed by server 142 and the one or more public safety content databases 144-147 may be distributed among public safety response vehicle 109, the infrastructure of public safety agency 140, and/or URV 104. For example, server 142 may be located in URV 104 or public safety response vehicle 109 while the one or more databases 144-147 are located, respectively, in public safety response vehicle 109 or the infrastructure of the public safety agency; or server 142 and one or more of the public safety content databases 144-147 may be located in public safety response vehicle 109 while other databases of the one or more public safety content databases 144-147 are located in the infrastructure of the public safety agency.

A first database 144 of the one or more databases 144-147 maintains three-dimensional (3D) models of objects or vehicles that may be of interest. For example, database 144 may be a vehicle make and model database that maintains vehicle make and model information and two-dimensional (2D) and 3D images of various vehicle makes and models. In response to a query comprising identification information of a vehicle, such as an image of a vehicle captured by URV 104, database 144 identifies the vehicle in the image and returns vehicle make and model information as well as a 3D image of the vehicle. In other embodiments of the present invention, database 144 may be a weapons database that, in response to a query comprising identification information of a weapon, such as an image of a gun or an explosive device captured by URV 104, identifies the weapon in the image and returns weapon make and model information as well as a 3D image of the weapon. Other databases of the one or more databases 144-147 maintain further public safety-related information that may be used by URV 104 and/or public safety responders 106 and 108 to obtain public safety information that facilitates a determination of follow-up actions to be undertaken by URV 104 or the public safety responders. For example, a second database 145 of the multiple databases 144-147 may be a vehicle registration database, for example, a Department of Motor Vehicles (DMV) database, that maintains vehicle registration information and driver's license information, such as license plate information (for example, license plate numbers and states) that is stored in association with information for an individual(s) to whom each such license plate is registered, such as the driver's license information for such an individual.
By way of further examples, a third database 146 of the multiple databases 144-147 may be a weapons (in the event that database 144 is other than a weapons database), explosive device, and/or chemical material database (such as a hazardous materials (hazmat) database and/or an illegal substance database) that maintains images of various weapons and explosive devices, a make and model information associated with each such weapon/explosive devices, and/or chemical identification and potential use information, and a fourth database 147 of the multiple databases 144-147 may be a criminal records database, such as a Records Management Service (RMS) database as known in the art. In some embodiments of the present invention, one or more of the multiple databases 144-147 may be included in server 142.
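The query pattern the databases support — identification information in, make/model and a 3D model reference out — can be illustrated with a minimal sketch. The in-memory dictionaries, keys, and record fields below are hypothetical stand-ins for databases 144 and 145; a deployed system would query remote database services.

```python
# Hypothetical in-memory stand-ins for public safety content databases.
VEHICLE_DB = {
    # recognized-vehicle key -> make/model information and a 3D model reference
    "sedan_xyz": {"make": "ExampleCo", "model": "Sedan XYZ", "model_3d": "sedan_xyz.obj"},
}
REGISTRATION_DB = {
    # (state, plate number) -> registrant information
    ("IL", "ABC1234"): {"registrant": "J. Doe"},
}

def identify_vehicle(vehicle_key: str):
    """Return make/model information and a 3D model reference,
    as the vehicle make and model database (144) would."""
    return VEHICLE_DB.get(vehicle_key)

def lookup_registration(state: str, plate: str):
    """Return registration information, as a vehicle registration
    database (145) would; None when the plate is not on file."""
    return REGISTRATION_DB.get((state, plate))
```

The same interface shape extends to the weapons/chemical database (146) and criminal records database (147): an identifier extracted from captured imagery keys a lookup whose result informs follow-up actions.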

Server 142 is designed to allow full use of incident scene 110 images captured by URV 104, including an ability to analyze the operational characteristics of incident scene 110 and an ability to analyze images of objects, vehicles, vehicle license plates, vehicle occupants, and detected weapons. More particularly, server 142 includes a server entity that may collect, process, and store data in a database of the server and one or more search engines that may search the one or more databases of communication system 100 in response to receiving an image from URV 104 and/or public safety responders 106 and 108. In other embodiments of the present invention, one or more of the search engines may be external to, and in communication with, server 142, for example, may be included in one or more of public safety content databases 144-147. Server 142 is connected to data network 130 via any of a wireless, wireline, or optical connection, or any other connection known in the art.

Referring now to FIG. 2, a block diagram of URV 104 is provided in accordance with some embodiments of the present invention. As shown, URV 104 generally includes a processor 202, at least one memory device 204, an image capture device 210, one or more sensors 212, a propulsion system 214, one or more input/output (I/O) interfaces 216, a location detector 222, and one or more wireless interfaces 230. Optionally, URV 104 further may include a server entity 224 and a database 226 accessible by the server entity, which server entity and database perform functionality similar to, and store data similar to, the functionality and data described herein as being performed by, and stored by, server 142 and public safety content databases 144-147. Additionally, URV 104 optionally may include a telescopic arm 228. The components (202, 204, 210, 212, 214, 216, 222, 224, 226, 228, 230) of URV 104 are communicatively coupled via a local interface 232. Local interface 232 can be, for example but not limited to, one or more buses or other wired or wireless connections, as is known in the art. Local interface 232 can have additional elements, which are omitted for simplicity, such as controllers, buffers (caches), drivers, repeaters, and receivers, among many others, to enable communications. Further, local interface 232 may include address, control, and/or data connections to enable appropriate communications among the aforementioned components. It should be appreciated by those of ordinary skill in the art that FIG. 2 depicts URV 104 in an oversimplified manner, and a practical embodiment may include additional components and suitably configured processing logic to support known or conventional operating features that are not described in detail herein.

URV 104 operates under the control of processor 202, such as one or more microprocessors, microcontrollers, digital signal processors (DSPs), combinations thereof or such other devices known to those having ordinary skill in the art. Processor 202 operates the corresponding communication device according to data and instructions stored in the at least one memory device 204, such as random access memory (RAM), dynamic random access memory (DRAM), and/or read only memory (ROM) or equivalents thereof, that stores data and instructions that may be executed by the corresponding processor so that the communication device may perform the functions described herein.

The data and instructions maintained by at least one memory device 204 include software programs that include an ordered listing of executable instructions for implementing logical functions. For example, the software in at least one memory device 204 includes a suitable operating system (O/S) and programs. The operating system essentially controls the execution of other computer programs, and provides scheduling, input-output control, file and data management, memory management, and communication control and related service. The programs may include various applications, add-ons, etc. configured to provide user functionality with URV 104.

At least one memory device 204 further maintains a motion planning module 206, a collision avoidance module 207, and an automated license plate recognition (ALPR) module 208. Motion planning module 206 comprises a set of data and instructions that, when executed by processor 202, generates motion instructions for propulsion system 214. That is, motion planning module 206 is able to issue motion instructions to propulsion system 214 based on sensor readings, motion instructions received from public safety agency 140, motion corrections provided by collision avoidance module 207, and a coarse location of the object/vehicle given by an operator of URV 104. When executing a task, motion planning module 206 may continuously determine current location information and provide motion instructions to propulsion system 214. Collision avoidance module 207 comprises a set of data and instructions that, when executed by processor 202, detects and avoids collisions with objects/vehicles. Collision avoidance module 207 utilizes image capture device 210 and the one or more sensors 212 to detect, and avoid collisions with, objects/vehicles. For example, in response to detecting an object or vehicle, collision avoidance module 207 may generate collision avoidance instructions that may be provided to motion planning module 206 and/or propulsion system 214 to prevent a collision with the detected object or vehicle. ALPR module 208 comprises image processing technology that transforms a license plate image captured by image capture device 210 into text of the license plate state and number, thereby facilitating a search of a vehicle registration database, such as database 145, or an RMS database, such as database 147, for vehicle registration or criminal information associated with the imaged license plate.
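The output side of an ALPR module — turning recognized plate characters into a structured state/number record for database queries — can be sketched as follows. The optical character recognition stage itself is out of scope here, and the two-letter-state plate pattern is a simplified, hypothetical format, not a claim about any real plate scheme.

```python
import re

def parse_plate_text(raw: str):
    """Normalize raw character-recognition output into a state/number
    record suitable for querying a vehicle registration database.

    Assumes an upstream OCR stage has already extracted characters from
    the plate image; the pattern below (two-letter state, optional
    separator, 2-8 alphanumeric characters) is hypothetical.
    """
    m = re.fullmatch(
        r"(?P<state>[A-Z]{2})[ -]?(?P<number>[A-Z0-9]{2,8})",
        raw.strip().upper(),
    )
    if m is None:
        return None  # unrecognized plate text; no database query issued
    return {"state": m.group("state"), "number": m.group("number")}
```

The resulting record keys a lookup into a registration database such as database 145, as described above.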

Image capture device 210, such as a camera or a video recording device, and the one or more sensors 212 may be used to generate images, such as 2D and 3D images, of an object or vehicle of interest, such as object/vehicle 102, and further to determine location information that allows URV 104 to measure its position with respect to, and its distance from, an object or vehicle. The one or more sensors 212 may include such sensors as a laser scanner and/or ultra-sonic range finder, a compass, an altimeter, an accelerometer, and other sensors known in the art that may allow URV 104 to determine a relative location, orientation, and proximity of an object or vehicle, such as object/vehicle 102. These sensors are used by motion planning module 206 and collision avoidance module 207 in order to determine a movement direction and a destination for URV 104. Additionally, by combining information collected by image capture device 210 and the one or more sensors 212, such as the laser scanner and/or ultra-sonic range finder, URV 104 is able to generate a 3D image, from at least a side view, of an object or vehicle. Further, the one or more sensors 212 may include environmental condition detection sensors that detect various physical conditions of the environment around URV 104. For example, such environmental condition detection sensors may include an electronic nose sensor that detects and analyzes substances in the air around URV 104 or a portable substance analyzer, such as a narcotics analyzer, capable of analyzing a physical substance, for example, retrieved by telescopic arm 228, for its chemical makeup.
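The determination of an object's position relative to the URV from a range finder and compass can be sketched with a simple planar model. This is a hypothetical simplification: a full implementation would fuse the laser scanner's point cloud with altimeter and accelerometer readings rather than a single range/bearing pair.

```python
import math

def relative_object_position(range_m: float, bearing_deg: float,
                             altitude_diff_m: float = 0.0):
    """Convert a range-finder distance and compass bearing into a local
    (east, north, up) offset from the URV to a detected object.

    bearing_deg follows compass convention: 0 = north, 90 = east.
    """
    theta = math.radians(bearing_deg)
    east = range_m * math.sin(theta)
    north = range_m * math.cos(theta)
    return (east, north, altitude_diff_m)
```

The offset feeds directly into the movement-direction and destination decisions made by motion planning module 206 and collision avoidance module 207.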

Propulsion system 214 comprises multiple physical motion generation devices, such as motors, gears, belts, and wheels in the event of a UGV, or one or more propeller systems each comprising one or more propellers and associated propeller circuitry, and/or one or more duct engines, such as jet engines, in the event of a UAV. In response to receiving instructions from processor 202, propulsion system 214 generates a physical movement of URV 104, including a maintaining of a current airborne position of a URV 104 in the event that it is a UAV. For example, as depicted in FIG. 1, propulsion system 214 comprises four propeller systems, each propeller system located at one of four corners of URV 104, that are capable of moving the URV up and down and in any lateral direction.

Motion planning module 206 and collision avoidance module 207 are used to make motion adjustments to properly position URV 104. More particularly, appropriate motion instructions are sent to propulsion system 214 in order to properly position URV 104. In doing so, collision avoidance module 207 may take precedence and may override any instructions from motion planning module 206. Thus, during operation, motion planning module 206 may generate instructions for propulsion system 214 to execute a particular route through an area as part of the execution of a task or to position itself in a particular location and/or orientation with respect to object/vehicle 102. That is, processor 202 may utilize motion planning module 206 to properly navigate the URV to a specific position in relation to the detected object/vehicle, at which position processor 202 may utilize image capture device 210 and sensors 212 to locate, scan, and collect images of the detected object/vehicle.
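The precedence rule just described — collision avoidance overrides motion planning — amounts to a small arbitration step in front of the propulsion system. The velocity-tuple command format below is a hypothetical illustration, not the disclosed interface.

```python
from typing import Optional, Tuple

Command = Tuple[float, float, float]  # hypothetical (vx, vy, vz) velocity command

def arbitrate_motion(planned: Command, avoidance: Optional[Command]) -> Command:
    """Select the command forwarded to the propulsion system.

    Collision avoidance takes precedence: whenever the avoidance module
    emits a correction, it overrides the motion planner's instruction;
    otherwise the planned command passes through unchanged.
    """
    return avoidance if avoidance is not None else planned
```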

The one or more I/O interfaces 216 include user interfaces that allow a person to input information in, and receive information from, URV 104. For example, the user interfaces may include a keypad, a touch screen, a scroll ball, a scroll bar, buttons, bar code scanner, and the like. Further, the user interfaces may include a display screen, such as a liquid crystal display (LCD), touch screen, and the like for displaying system output. Additionally, the user interfaces may include a microphone 218 via which a person can input audio into the URV, and a speaker 220 via which the person can receive audio from the URV. I/O interfaces 216 also can include, for example, a serial port, a parallel port, a small computer system interface (SCSI), an infrared (IR) interface, a universal serial bus (USB) interface, and the like for communicating with, or coupling to, an external device.

Location detector 222 determines a geographical location of URV 104. Location detector 222 may comprise, for example, a GPS receiver and/or may comprise circuitry, for example, one or more antennas and a microprocessor, such as being implemented by processor 202, by which URV 104 may receive signals from multiple base stations and determine its location based on the received signals, such as based on time differences of arrival (TDOA) among such signals and/or triangulation. In still other exemplary embodiments of location detector 222, URV 104 may transmit, via the one or more wireless interfaces 230, a signal to each of multiple base stations, which may in turn determine a location of the URV based on time differences of arrival (TDOA) among the signals received at each such base station and/or triangulation, and then one or more of the base stations may transmit the determined location back to the URV. Based on the signals received from the one or more base stations, location detector 222 then determines the location of the URV. In yet other embodiments of the present invention, location detector 222 may include ultrasonic and/or ultra-wideband transmitter/receiver circuitry capable of engaging in wireless communications with ultrasonic and/or ultra-wideband transmitter/receiver circuitry of one or more nearby public safety response vehicles, such as public safety response vehicle 109. Based on such communications, a processor of the location detector or processor 202 may determine a location of the URV as known in the art.
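A simple form of the multi-station position fix mentioned above can be sketched with planar trilateration: given ranges to three base stations at known positions, subtracting the circle equations pairwise yields a linear system in the unknown (x, y). This is a simplified stand-in for the TDOA processing described; real TDOA works with range *differences* and more stations.

```python
def trilaterate_2d(anchors, ranges):
    """Estimate (x, y) from ranges to three known anchor positions.

    anchors: three (x, y) base-station positions; ranges: the measured
    distance from the URV to each. Subtracting circle 1 from circles 2
    and 3 linearizes (x - xi)^2 + (y - yi)^2 = ri^2 into two equations.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # Circle 1 minus circle 2:  A*x + B*y = C
    A = 2 * (x2 - x1); B = 2 * (y2 - y1)
    C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    # Circle 2 minus circle 3:  D*x + E*y = F
    D = 2 * (x3 - x2); E = 2 * (y3 - y2)
    F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = A * E - B * D  # zero when the anchors are collinear
    x = (C * E - B * F) / det
    y = (A * F - C * D) / det
    return (x, y)
```

With GPS unavailable (for example, near large structures at an incident scene), such a fix from base-station signals could supply the location that motion planning module 206 consumes.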


