5.19.1 Description

Sensing has been considered in this technical report in terms of interaction between a UE and a base station. That information is only partial, however, as it extends along a limited UE-base station axis. It can nevertheless be treated as one component of a scene that, when gathered with other available sensor data, can be synthesized into more comprehensive information.
Where the sensor is video, LiDAR, sonar, etc. (that is, it operates in some way other than 3GPP-defined radio access technology), it is still valuable to gather simultaneous sensor data and combine it. Only in this way can sensor data capturing a scene (such as the front, back and sides of an object of interest) be obtained.
The Localized Mobile Metaverse Services use case (5.1 in [11]) includes the following text: "His mobile device begins to collect information about his surroundings. The collected information can include information that is obtained by interacting with nearby devices (e.g. sensors and other mobile devices)."
This use case explores the implications of this function. Specifically, how can a UE identify sensors that are present that can provide information sought by the UE?
In this particular use case, on a construction site, a crane is lifting a large object near a tower. Construction worker safety, efficient pursuit of tasks and situational awareness to prevent accidents are all important. Instead of relying solely on the crane operator, this use case allows the 5G system to model and track the tower, crane and payload while the payload is in motion.
Figure 5.19.1-1: Group Sensing, to provide sensing services
In Figure 5.19.1-1, the UE is on a construction site. It seeks to identify sensors available at that site that can provide sensing data potentially relevant to obtaining information for the user. These sensors - in the UE's proximity and available to provide sensor data (that is, the service is authorized) - comprise a sensor group.
Correspondence between Sensor Groups and other groups defined in stage 2 is neither implied by this use case nor excluded. There is no correspondence between Sensor Groups and other groups defined in stage 1.
The use case assumes that sensors that can form a sensor group are either UEs or communicate by means of a UE (as a kind of 'split terminal equipment (TE)' UE).
It is essential to capture a 'synchronous' group of sensors and their movements in 3 dimensions in order to optimally combine the sensor data for the purpose of localization computations. This is particularly important when the viewer is near a large object or any non-static object must be modeled on all sides. This is an active area of research, for example in 6DoF tracking and sensing [44]. The techniques described in that paper concern how to determine the 3D position and orientation of drones with synchronization, but it is clear that this is a fundamental requirement for a group of sensors providing input on a single physical object in (absolute or relative) motion.
The synchronous group of sensors forms a logical set of devices whose data acquisition is critical for a particular task. The handling of this group is unique to this problem domain because of (i) the need to synchronize the uplink transmissions of sensor measurement data from members of the group, so they can be combined in a timely way to produce a meaningful sensor measurement result for an object in motion, and (ii) the dynamic nature of the group: the set of sensors appropriate to use, i.e. those prepared to obtain sensor measurement data of moving objects, will change over time.
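The two constraints above - a common synchronization window for uplink sensor data and dynamic group membership - can be illustrated with a short Python sketch. The 5 ms window matches the synchronization figure used later in this use case; the data layout (sensor IDs, a shared time base) is an illustrative assumption, not something specified by this report.

```python
from dataclasses import dataclass

SYNC_WINDOW_S = 0.005  # 5 ms synchronization tolerance, as assumed in this use case


@dataclass
class SensorSample:
    sensor_id: str    # member of the sensor group
    timestamp: float  # acquisition time in seconds, on an assumed common time base
    payload: bytes    # raw sensor measurement data


def group_synchronous_samples(samples, reference_time):
    """Keep only samples acquired within the sync window around the
    reference time; only these can be combined into one meaningful
    measurement of an object in motion."""
    return [s for s in samples
            if abs(s.timestamp - reference_time) <= SYNC_WINDOW_S]
```

Samples falling outside the window would have to be discarded or re-requested, since combining stale data about a moving object yields an inconsistent result.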
In this use case, a user's UE identifies a sensor group through interaction with an AS. Part of identifying the sensors in the group is obtaining sufficient information for the user to become authorized to obtain sensor data, along with the relevant service access information. The goal of the use case is to enable the acquisition of sensor data in a UE's proximity. The communication of the sensing data itself is out of scope of this use case.
Discovery of the kind described here can potentially be accomplished by different radio technologies, e.g. NR ProSe, UWB, IEEE 802.11, etc.
5.19.2 Pre-conditions

Benoît inspects various construction sites. He has a UE equipped with a set of surveillance and appraisal applications.
On construction sites he visits, there are sensors deployed. Some are UEs, e.g. using NR-based sensing. Other sensors include video cameras, LiDAR equipment and passive infrared sensors. These are not, generally, installed directly in the terminal equipment, but rather use the terminal equipment to communicate, as shown in figure 5.19.2-1.
Figure 5.19.2-1: Sensors available to become a sensing group
In the scenario above, (a) is a UE that serves various sensors that are themselves not UEs. The means by which these sensors communicate with the UE is out of scope of this use case. They could be e.g. connected by means of a physical cable. (b) is a UE that is capable of 3GPP defined sensing. (c) is a UE that can operate the UE camera for sensing purposes.
As the RF measurement data from the 3GPP system contains sensitive information (e.g. the location of the sensing transmitter/receiver, information about objects), it is processed in the 5G network to produce sensing results. 3GPP sensing data is not shared outside the 5G network. The application enabler layer combines these results with other data, such as non-3GPP sensing data, to produce combined sensing results.
In this example (a) and (b) are authorized and ready to send mobile originated sensing data to an AS.
The non-3GPP sensing data from UEs and sensing results from 3GPP network acquired by the AS can be combined in software in a manner that is out of scope of the 5G standard. In this use case, the combination is performed by the AS.
Benoît's UE, (c), is authorized to access the media provided by the AS (the sensing result output of the AS, produced by taking account of the non-3GPP sensing data from (a) and (b) and the sensing results from the 3GPP network).
Benoît, using UE (c), monitors the construction site for safety and efficiency.
5.19.3 Service Flows

Benoît's UE (c) uses functionality provided by the 5G system to seek to determine the existence of UEs (a) and (b), referring to Figure 5.19.2-1.
UE (c) has knowledge of the AS that accumulates sensor information that could be of interest.
UE (c) asks the AS that accumulates sensor information for a sensor group: what sensors are in the proximity?
UE (c) is able to become authorized to receive information concerning the sensor group.
The Application Server (AS) requests that the 5G system obtain a sensing group.
The 5G system will identify the set of sensing group members in the proximity of UE (c) that provide the AS with sensor information. The 5G system will strive to synchronously locate 4 or more devices, localized to within 10 cm accuracy, with the accuracy of measurement within 5 ms of synchronization.
NOTE: It is impractical for the AS to continuously track the location of each of these UEs because their location can change and this information is needed only on demand. The group needs to be captured together, at the same time, since the sensor data of the group needs to be interpreted by UE (c) together.
The AS provides UE (c) with sufficient information to identify the sensing group members that are ready to provide non-3GPP sensing data and sensing results, as well as how to obtain the sensing data of the sensing group from the AS.
UE (c) requests to obtain combined sensing results from the AS/sensing group. The sensors must be within proximity and their locations known with great accuracy (within 10 cm in 3D), with the accuracy of measurement within 5 ms of synchronization.
UE (c) is authorized to obtain combined sensing results from the AS/sensing group.
UE (a), UE (b) provide non-3GPP sensing data and the network provides sensing results. These are received by the AS and combined. This combined sensing result is provided to UE (c) by the AS.
UE (b) is mobile and its position varies. UE (c) must know its position with sufficient accuracy to interpret the sensing result meaningfully. Since UE (b) is mobile, its position and movements must be tracked to within 10 cm accuracy, with the accuracy of measurement within 5 ms of synchronization.
UE (b) leaves the proximity of UE (c). UE (c) identifies that UE (b) has left the sensing group.
Later, UE (b) returns to proximity of UE (c). UE (c) identifies that UE (b) has joined the sensing group, including its position within 10 cm, with accuracy of measurement within 5 ms of synchronization.
The crane operations are monitored by sensors (a) and (b), which provide different perspectives, by means of non-3GPP sensing data from non-3GPP sensors and sensing results from the 3GPP network, to the AS.
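The join/leave behaviour in the flow above can be sketched as a proximity-based membership update. The 100 m proximity range follows PR 5.19.6-1; the 2-D position representation and sensor identifiers are illustrative assumptions, not specified by this report.

```python
import math


def update_sensing_group(current_group, sensor_positions, ue_c_pos, proximity_m=100.0):
    """Recompute sensing-group membership from current sensor positions:
    sensors within proximity_m of UE (c) are members. Returns the new
    membership plus the sets of sensors that joined and left."""
    cx, cy = ue_c_pos
    new_group = {sid for sid, (x, y) in sensor_positions.items()
                 if math.hypot(x - cx, y - cy) <= proximity_m}
    return new_group, new_group - current_group, current_group - new_group
```

Calling this each time tracked positions refresh reproduces the flow: UE (b) leaving the proximity of UE (c) appears in the "left" set, and its later return appears in the "joined" set.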
5.19.4 Post-conditions

Benoît, making use of UE (c), is able to ascertain with very high accuracy the location and movement of the entire group of UEs that can form a sensor group. The ability of UE (c) to identify the position and membership of the group continues over time, so that the current membership of the group is known, and that membership can change.
The AS is able to combine the non-3GPP sensing data acquired by non-3GPP sensors and the sensing results acquired by the 3GPP network from different perspectives, and to produce a useful 3D representation of the site for the site supervisor, who receives the combined sensing result by means of media delivered from the AS to Benoît's UE (c).
5.19.5 Existing feature partly or fully covering use case functionality

There are requirements specified in TS 22.261, clause 6.37.2, to support ranging services that are relevant to this use case. These were developed in the FS_Ranging study [43].
- The 5G system shall be able to support for a UE to discover other UEs supporting ranging.
- The 5G system shall be able to start ranging and stop ranging according to the application layer’s demand.
- The 5G system shall be able to provide mechanisms for a MNO, or authorized third-party, to provision and manage ranging operation and configurations.
- The 5G system shall be able to support ranging enabled UEs to determine the ranging capabilities (e.g. capabilities to perform distance and/or angle measurement) of other ranging enabled UEs.
- The 5G system shall be able to allow a ranging enabled UE to determine if another ranging enabled UE is stationary or mobile, before and/or during ranging.
- The 5G system shall allow ranging service between 2 UEs triggered by and exposed to the application server.
Differences between this use case and the above ranging requirements, and ranging in general include:
- Sensing data is not acquired 'between UEs' but by means of different sensing technologies.
- It is not sufficient to discover other UEs that support sensing: these must be in the discovering UE's proximity and the discovered UE's precise location must be ascertained.
- In this use case, a 'sensing group' is formed, whereas in Ranging, all range information was acquired through interactions directly (or indirectly, if relayed) between UEs.
5.19.6 Potential New Requirements needed to support the use case

[PR 5.19.6-1] Based on a third-party request, the 5G system shall be able to discover a suitable sensing group where sensing transmitters and receivers are within 100 m range, to be localized within 10 cm accuracy, with the accuracy of the sensing measurement process within 5 ms of synchronization.
[PR 5.19.6-2] Based on third-party request, the 5G system shall be able to discover a sensing group in the proximity of the UE that is requesting the service from the AS.
NOTE: This requirement assumes that a UE requests an AS to discover a set of sensing group members that have sensing functions that can provide a sensing service.
5.20 Use case of Sensing for Parking Space Determination
5.20.1 Description

Sensing technology can improve the user experience in a parking garage by enabling the vehicle and the parking garage to obtain more information, e.g. whether a parking space is available. An indoor/underground parking garage can install multiple Sensing receivers and Sensing transmitters throughout the concrete structure to detect the availability of parking spaces. An outdoor parking garage can likewise exploit multiple Sensing receivers and Sensing transmitters for this purpose.
Another related use case is automated parking e.g. AVP (Automated Valet Parking) and AFP (Automatic Factory Parking) [45] where cars are provided with drive-path information to do automated parking in a given parking lot facility. Connectivity is an important component in automatic parking, and the 3GPP sensing technology can serve as the way to determine available parking spaces and the best route for a car to reach it.
The coverage could be provided either by a public network or by a private network specific to the parking garage. For the indoor Sensing receiver(s) and Sensing transmitter(s), see Figure 5.20.1-1: one deployment scenario is that Sensing receiver(s) and Sensing transmitter(s) are ceiling-mounted and located so as to provide sensor coverage for the parking bays within the structure, with the Sensing transmitter and Sensing receiver co-located. Another deployment method is that some Sensing receiver(s) and Sensing transmitter(s) are ceiling-mounted and some are mounted on the floor or wall, with the Sensing transmitter and Sensing receiver separately located. For the outdoor Sensing receiver(s) and Sensing transmitter(s), see Figure 5.20.1-2: they can be deployed at a relatively high position to guarantee coverage of the parking garage.
Figure 5.20.1-1: Parking space determination (indoor deployment)
Figure 5.20.1-2: Parking space determination (outdoor deployment)
Multiple detection methods can use the sensing signals emitted from the Sensing transmitter toward the target object/area and the sensing signals bounced/reflected back. For example, the Sensing receiver can measure the reflected signal power of the target area [8]. Since cars are usually made of metal while the ground is covered by cement or plastic cement, these materials differ significantly in reflected signal power, so the Sensing receiver can determine whether a parking space is taken simply by measuring the difference in reflected signal power. Another method is for the Sensing receiver(s) and Sensing transmitter(s) to identify parking space availability by monitoring target movements. Compared with stationary objects such as the ground and poles, a car that is parking or leaving is easy to distinguish; by measuring distance/angle/velocity, the system can record that a parking space is occupied when a car parks and free when a car leaves. Alternatively, the Sensing receiver(s) and Sensing transmitter(s) can detect parking space availability by generating a 3-D point cloud [9]; both stationary and moving targets can then be easily detected in the point cloud, especially with back-end data processing techniques such as machine learning/deep learning.
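The first method above - distinguishing occupied from free bays by reflected signal power - can be sketched as a simple threshold classifier. The threshold value below is hypothetical; a real deployment would calibrate it per site and per bay.

```python
# Hypothetical calibrated threshold in dB separating the strong reflection of a
# metallic car body from the weaker reflection of cement or plastic-cement ground.
OCCUPIED_POWER_THRESHOLD_DB = -60.0


def parking_space_occupied(reflected_power_db: float) -> bool:
    """Classify a parking bay as occupied when the measured reflected
    signal power exceeds the calibrated threshold."""
    return reflected_power_db > OCCUPIED_POWER_THRESHOLD_DB
```

The point-cloud and movement-based methods would replace this single scalar comparison with richer per-bay features, but the decision structure (measurement in, occupancy state out) stays the same.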
5.20.2 Pre-conditions

This use case concerns a public multi-storey parking garage that has installed Sensing receiver(s) and Sensing transmitter(s) throughout the concrete structure to detect the positions of people, objects, and vehicles within the garage. The concrete structure can make coverage difficult and, especially in underground levels, coverage provided by external transmitters can be very poor. The parking garage can inform entering vehicles about the availability of parking spaces.
Consider the example scenario shown in Figure 5.20.2-1. A typical parking space is 5 m long and 2.5 m wide, so in the horizontal dimension the sensing resolution must be sufficient to distinguish adjacent parking spaces. The results can be aggregated by the parking garage operator and the parking state updated within seconds, even when the coverage of the parking garage is poor, which avoids wasted time for entering vehicles.
Figure 5.20.2-1: Example of sensing scenario
5.20.3 Service Flows

1. James wants to park his car at the public parking garage on floor #B2. His vehicle, on entering the public parking garage, queries the parking garage service for availability of a parking bay on floor #B2.
2. The parking garage operator activates the sensing devices on floor #B2. Sensing receiver(s) and Sensing transmitter(s) sense the parking spaces without interfering with each other, or with tolerable interference. The sensing results are aggregated by the parking garage operator.
3. With the aggregated sensing results, the parking garage operator sends the sensing results back as an addition to the dynamic map of floor #B2, which shows the current status of parking bays in the structure.
4. James can see the available parking bays on his in-car display and choose a suitable one.
5.20.4 Post-conditions

Thanks to sensing, James has found a parking space on the correct floor without issue, making his daily travel easier.
5.20.5 Existing features partly or fully covering the use case functionality

None.
5.20.6 Potential New Requirements needed to support the use case

[PR 5.20.6-1] The 5G system shall be able to provide sensing services in licensed and unlicensed spectrum.
[PR 5.20.6-2] The 5G system shall be able to authorize Sensing receiver(s) and Sensing transmitter(s) to participate in a sensing service.
[PR 5.20.6-3] Based on operator’s policy, the 5G system shall enable a trusted third-party to request the activation of the sensing service with specific KPI requirement, as well as deactivation of the same service.
[PR 5.20.6-4] The 5G system shall be able to support charging for the sensing services (e.g. considering service type, sensing accuracy, target area, duration).
[PR 5.20.6-5] The 5G system shall be able to provide a sensing service considering the interference to the Sensing service caused by the sensing operations between multiple Sensing transmitter(s) and Sensing receiver(s).
[PR 5.20.6-6] The 5G system shall be able to provide sensing with the following KPIs.

Table 5.20.6-1: Performance requirements of sensing results for parking space determination

| Scenario | Sensing service area | Confidence level [%] | Positioning accuracy, horizontal [m] | Positioning accuracy, vertical [m] | Velocity accuracy, horizontal [m/s] | Velocity accuracy, vertical [m/s] | Range resolution [m] | Velocity resolution, horizontal/vertical [m/s x m/s] | Max sensing service latency [ms] | Refreshing rate [s] | Missed detection [%] | False alarm [%] |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Parking space determination | Indoor/outdoor | 95 | 0.5 | 0.5 | 0.1 | N/A | 2.5 perpendicular to the parking space; 5 parallel to the parking space | N/A | 1000 | 1 | 1 | 5 |

NOTE: Positioning and velocity accuracy values are given for the target confidence level. The terms in Table 5.20.6-1 are found in Section 3.1.
5.21 Use case of Seamless XR streaming
5.21.1 Description

Extended Reality (XR) is an important 5G use case. Split-rendering architectures, where the heavy XR video rendering computation is done at the application server based on control information received from the UE, pose strict Quality-of-Service (QoS) requirements in terms of round-trip latency and throughput for delivering the video and control information.
It is therefore crucial to always maintain a high-quality wireless link for XR, and critical to predict and adapt quickly to wireless channel changes. This is especially true in millimetre wave bands, in which the channel and propagation characteristics are very sensitive to user and environment changes such as blockages, user motion or rotation.
Adapting quickly to wireless channel changes requires an understanding of the channel dynamics, which in turn depends on understanding the surrounding environment: the transmitter and receiver locations, the geometry of buildings, moving scatterers, the location and material of blockers, etc.
Interestingly, most of the XR streaming devices (e.g., 5G phones, AR/VR headsets) and third-party entities that support 5G (i.e., 3GPP sensors) also support non-3GPP sensors, such as RF sensors, Inertial Measurement Units (IMU) sensors, RGB cameras, position sensors, and others.
In light of the availability of 3GPP and non-3GPP sensors and the need of environment understanding, it is therefore natural to utilize the overall sensing information to acquire an understanding of the surrounding environment.
To this end, a "Sensing RF Map Service" can be envisioned that collects sensing information from 3GPP and non-3GPP sensors, processes it, and provides the result to consumers of the sensing service.
• The input to this "Sensing RF Map" service could be 3GPP sensing data and non-3GPP sensing data from multiple sensors, e.g., RF sensing data, XR user position, camera images, depth maps, hand tracking, motion type, etc. Such input could be produced by a 5GS entity (e.g., UE or RAN entities) or by a third-party (e.g., a surveillance camera). It is essential to note that the collection of this sensing data should be done with appropriate user consent and adherence to regional and national regulations.
• The processing of 3GPP and non-3GPP sensing data can be performed within the 5G system or outside the 5GS (for example, on an application server). In this use case, the focus is on processing in the 5G system.
• The output of this service (i.e., the sensing result) is some understanding of the environment and/or its impact on the communication performance of a service consumer, e.g., an RF environment map. When the sensing result is shared outside the 5GS, the appropriate consent and permissions for sharing this information are required.
• The consumer of this service could be a third-party application or other entities in the 5GS.
It is important to note that while this service is provided by the 5GS or an edge server, the business model for monetization of this service would need to consider factors such as the entities involved in sensing, the transfer of the non-3GPP sensing data, and the value of the Sensing RF Map information produced by these entities as well as its value to the consumer of the service. These considerations also apply to scenarios involving 3GPP-only sensing operations, but additional considerations are required for non-3GPP sensing data, which is generated outside the 5G system.
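A minimal sketch of the fusion step of such a "Sensing RF Map Service" follows, assuming observations arrive as already-localized (x, y, kind) tuples and using a hypothetical 1 m grid; neither representation is specified by this report.

```python
from collections import defaultdict

CELL_SIZE_M = 1.0  # hypothetical grid resolution of the RF map


def build_rf_map(observations):
    """Fuse localized sensing observations (from 3GPP sensing data and
    non-3GPP sensors alike) into a grid map of environmental features.
    Each observation is (x_m, y_m, kind), kind being e.g. 'blocker'
    or 'reflector'."""
    rf_map = defaultdict(set)
    for x, y, kind in observations:
        cell = (int(x // CELL_SIZE_M), int(y // CELL_SIZE_M))
        rf_map[cell].add(kind)
    return dict(rf_map)
```

A real service would attach confidence and timestamps per cell and age out stale observations, but the core idea - many heterogeneous sensors contributing to one spatial representation - is captured by the merge above.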
5.21.2 Pre-conditions

Jose is playing a game inside a gaming arena using a VR headset that is connected to a RAN entity. The VR headset, RAN entity and third-party surveillance system are configured to provide 3GPP sensing data and non-3GPP sensing data to the "Sensing RF Map Service".
The VR headset is equipped with 3GPP sensors and non-3GPP sensors such as IMU sensors and cameras, and it can provide sensing inputs, e.g., 3GPP sensing data, headset pose and location, velocity, images of the environment and processed images (such as motion patterns and maps). The VR headset can also provide communication reference signal measurements or reports to the Sensing RF Map Service.
The RAN entity has 3GPP NR RF capabilities and can provide 3GPP sensing data to the 5GS, which processes it and provides sensing results to the Sensing RF Map Service.
The gaming arena also has cameras deployed by a trusted third-party surveillance camera company, which can provide images of the environment and processed images (such as motion patterns and maps) to the 5GS.
5.21.3 Service Flows

1. Jose is playing a game using a VR headset in an arena with some obstacles and other gamers in the environment. Jose moves through the arena and approaches a communication blocker which could potentially impact the performance of the wireless communication between the VR headset and the RAN entity.
2. Jose’s VR headset, RAN entity and the third-party surveillance system provide 3GPP sensing data, and non-3GPP sensing data to the Sensing RF Map Service. The Sensing RF Map Service combines the 3GPP and non-3GPP sensing data to produce a sensing result which is a comprehensive RF map of the environment surrounding the headset (e.g. information such as the location of RAN entities, reflectors, static blockers, etc. and an indication of wireless link blockage event, e.g., people walking by blocking the 5G link).
NOTE: An RF map is a spatial/geographical representation of environmental characteristics (e.g., wireless propagation and objects such as RF signal reflectors, blockers in the environment). This map enables improvements in areas such as radio resource management, beam management, mobility and user applications.
3. The 5GS uses the RF map to predict that Jose's communication link is about to be blocked if he comes close to the blocker, and this prediction is sent to the communication and/or application layers of the game. For example, the application layer adjusts the content of rendered video frames accordingly (e.g., lowers the frame rate, or adds a virtual obstacle in the rendered video to prevent Jose from coming close to the blocker).
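The prediction in step 3 can be sketched as a proximity test against the blocker positions recorded in the RF map. The 2 m warning radius is a hypothetical tuning parameter; the report gives no value.

```python
import math


def predict_blockage(user_pos, blocker_positions, warning_radius_m=2.0):
    """Predict an imminent link blockage when the user is within the
    warning radius of any blocker known to the RF map."""
    ux, uy = user_pos
    return any(math.hypot(ux - bx, uy - by) <= warning_radius_m
               for bx, by in blocker_positions)
```

A production predictor would also use the user's velocity and the beam geometry toward the RAN entity, rather than position alone, but the trigger for the application-layer adaptation is the same boolean signal.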
5.21.4 Post-conditions

Jose enjoys a seamless XR gaming experience without video frame drops, i.e., no video glitches, because the Sensing RF Map information was leveraged to assist both the communication service and the application.
5.21.5 Existing features partly or fully covering the use case functionality

None.
5.21.6 Potential New Requirements needed to support the use case

[PR 5.21.6-1] Subject to user consent and regulatory requirements, based on operator policy, the 5G system shall be able to support secure means for RAN entities and authorized UEs to provide 3GPP sensing data to a 5G network for processing.
[PR 5.21.6-2] Subject to user consent and regulatory requirements, based on operator policy, the 5G system shall be able to collect non-3GPP sensing data from trusted parties.
[PR 5.21.6-3] Subject to user consent and regulatory requirements, based on operator policy, the 5G system should be able to support the combination of the 3GPP sensing data and non-3GPP sensing data to derive combined sensing result.
[PR 5.21.6-4] Subject to user consent and regulatory requirements, based on operator policy, the 5G system shall be able to expose the combined sensing results to a trusted third-party service provider.
5.22 Use case of UAVs/vehicles/pedestrians detection near Smart Grid equipment
5.22.1 Description

In the future, there will be more and more autonomous devices, such as drones and self-driving cars. These devices can strongly affect their surroundings, which may have an impact on operating equipment in the Smart Grid.
For example, vehicles, such as UAVs and engineering vehicles, may affect the operation safety of multiple links such as power generation, power transmission, and power transformation.
At present, multiple power transmission and transformation scenarios in the Smart Grid industry are potential applications of integrated sensing and communication technology. For example, accidents have been caused by vehicles hooking or damaging transmission lines during power transmission, so transmission stations need to identify and warn vehicles. In power transformation, there are security risks such as candid photography and attacks by drones, or electric shock when approaching equipment. In short, there are requirements for perimeter intrusion detection and UAV detection at substations.
5.22.2 Pre-conditions

There are existing 5G base stations deployed near the transmission stations and substations, which can provide constant remote sensing of the location of intruders in the coverage area, including UAVs, engineering vehicles and pedestrians. Network operator A can use these 5G base stations to provide a 5G sensing service for the Smart Grid operator X, including sensing the motion trail of the UAVs, vehicles and pedestrians in their working area.
The Smart Grid Operator X uses the 5G sensing service provided by 5G network Operator A to detect potential intrusion/approaching of UAVs, vehicles and pedestrians near the transmission stations and substations.
The Smart Grid operator sets the border of a restricted area for the transmission stations/lines and substations, which no UAV, vehicle or pedestrian may enter, and defines a warning distance value. Once a UAV, vehicle or pedestrian is detected whose distance from the border is less than the warning distance value, the 5G system reports the event to the Smart Grid operator so that it can send an alerting message.
The 5G base stations can sense the location of the UAV/vehicle/pedestrian constantly and send these data to the 5G core network. The sensing node and computing node can then analyse and predict the path of the UAV or pedestrian from a large amount of data and give early warning of potential security risks.
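The warning-distance check described above can be sketched as follows, using a rectangular restricted area as an illustrative stand-in for the border the Smart Grid operator configures.

```python
import math


def distance_to_restricted_area(x, y, xmin, xmax, ymin, ymax):
    """Distance from a sensed target position to the border of a
    rectangular restricted area (0 when the target is inside)."""
    dx = max(xmin - x, 0.0, x - xmax)
    dy = max(ymin - y, 0.0, y - ymax)
    return math.hypot(dx, dy)


def should_alert(x, y, area, warning_distance_m):
    """Report the event when the target is closer to the border than
    the operator-defined warning distance (or already inside)."""
    return distance_to_restricted_area(x, y, *area) < warning_distance_m
```

Real restricted areas around transmission lines would be polygons or corridors rather than rectangles, but the reporting condition (distance to border below the configured warning distance) is the same.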
5.22.3 Service Flows

1. The Smart Grid Operator X requests a sensing service from network operator A to collect sensing data in the defined area (i.e., the park covering transmission stations and substations). Network operator A configures the base stations located in the defined area to perform sensing.
2. The 5G RAN constantly collects 3GPP sensing data on the location of UAVs/vehicles/pedestrians in the defined area and sends the sensing data to the 5G core network at a defined frequency to obtain the sensing result (i.e., the distance between the UAV/vehicle/pedestrian and the border, or its motion trail).
3. The 5G system sends a notification to UAVs/vehicles/pedestrians equipped with a UE that they are near a restricted area, and also reports the sensing results to the Smart Grid operator. The Smart Grid operator decides whether to send an alerting message to the intruding/approaching UAVs/vehicles/pedestrians based on the sensing results. In addition, the staff working in the park respond to the emergency and prepare to intercept the intruders if they do not leave.
5.22.4 Post-conditions

The UAVs/vehicles/pedestrians leave the defined area, and potential security risks are avoided. Thanks to the wide-area and constant sensing capability of the 5G base stations, and the precise data processing and prediction by the 5G core network, safety supervision of the Smart Grid is improved.
5.22.5 Existing features partly or fully covering the use case functionality
In TS 22.261, there are existing requirements on information exposure, in clause 6.10:
The 5G system shall be able to:
- provide a third-party with secure access to APIs (e.g. triggered by an application that is visible to the 5G system), by authenticating and authorizing both the third-party and the UE using the third-party's service.
- provide a UE with secure access to APIs (e.g. triggered by an application that is not visible to the 5G system), by authenticating and authorizing the UE.
- allow the UE to provide/revoke consent for information (e.g., location, presence) to be shared with the third-party.
- preserve the confidentiality of the UE's external identity (e.g. MSISDN) against the third-party.
- provide a third-party with information to identify networks and APIs on those networks.
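A minimal sketch of the authorization logic implied by the quoted requirements. All identifiers, the consent registry and the decision function are hypothetical; a real network exposes these checks through its northbound APIs rather than a standalone function.

```python
# Hypothetical registries standing in for the network's authorization
# and user-consent state (see the quoted TS 22.261 requirements).
AUTHORIZED_THIRD_PARTIES = {"smart-grid-op-x"}
USER_CONSENT = {"ue-123": {"location": True, "presence": False}}

def may_expose(third_party, ue_id, info_type):
    """Expose UE information only if the third-party is authorized and
    the UE has consented to sharing this type of information."""
    if third_party not in AUTHORIZED_THIRD_PARTIES:
        return False
    return USER_CONSENT.get(ue_id, {}).get(info_type, False)

print(may_expose("smart-grid-op-x", "ue-123", "location"))  # consent given
print(may_expose("smart-grid-op-x", "ue-123", "presence"))  # consent revoked
```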
5.22.6 Potential New Requirements needed to support the use case
[PR 5.22.6-1] Subject to operator policy, the 5G system shall enable the network to expose a suitable API to an authorized third-party to provide the information regarding sensing results.
[PR 5.22.6-2] Based on operator policy, the 5G system may be able to utilize sensing assistance information exposed by a trusted third-party to derive the sensing result.
[PR 5.22.6-3] The 5G system shall be able to support the following KPIs:
Table 5.22.6-1: Performance requirements of sensing results for UAVs/vehicles/pedestrians' detection near Smart Grid equipment

| Parameter | Value |
|---|---|
| Scenario | Sensing for the use case in Smart Grid (NOTE 2) |
| Sensing service area | Outdoor |
| Confidence level [%] | 95 |
| Accuracy of positioning estimate by sensing, horizontal [m] (for the target confidence level) | ≤0.7 |
| Accuracy of positioning estimate by sensing, vertical [m] | N/A |
| Accuracy of velocity estimate by sensing, horizontal [m/s] | UAV: ≤25; Pedestrian: ≤1.5; Vehicle: ≤15 |
| Accuracy of velocity estimate by sensing, vertical [m/s] | N/A |
| Range resolution [m] | N/A |
| Velocity resolution (horizontal/vertical) [m/s x m/s] | N/A |
| Max sensing service latency | ≤5 s |
| Refreshing rate | ≥10 Hz |
| Missed detection [%] | [≤5] |
| False alarm [%] | [≤5] |

NOTE 1: The terms in Table 5.22.6-1 are found in Section 3.1.
NOTE 2: The typical size (Length x Width x Height) of a UAV is 1.6 m x 1.5 m x 0.7 m, the typical size of a pedestrian is 0.5 m x 0.5 m x 1.75 m, and the typical size of an engineering vehicle is 7.5 m x 2.5 m x 3.5 m. The size of the park of the Smart Grid depends on the real environment.
NOTE 3: The safe distance between pedestrian/vehicle and transmission station/line is 0.7 m/0.95 m [46].
5.23 Use case on AMR collision avoidance in smart factories
5.23.1 Description
Autonomous mobile robots (AMRs) are currently being introduced in many logistics operations, e.g. manufacturing, warehousing, cross-docks, terminals, and hospitals. Compared to an automated guided vehicle (AGV) system, in which a central unit takes control of scheduling, routing, and dispatching decisions for all AGVs, AMRs are robots built with intelligence to autonomously move and perform tasks. AGVs are expected to further evolve into intelligent AMRs to meet the demands of the intelligent factory.
Compared to AGVs, which move on transport paths guided by rails, magnetic markers, etc., AMRs can travel automatically without such fixed guides. AMRs do not rely on predetermined paths and can easily adjust their routes as user demands change, so AMRs have a wider mobile range and more flexibility. AMRs can not only stop in time to avoid humans and other obstacles, but also adjust their route towards the destination. However, during the AMR working process, the sensing range of a single AMR is limited and the status of the AMR's surrounding environment may not be detected in time. For example, people or other machines that suddenly appear from behind large factory equipment can affect the driving safety of the AMR. It is therefore very challenging for an AMR to get accurate and continuous sensing information along its route.
5G base stations can be deployed in a factory not only to provide communication capabilities for equipment in the factory but also to sense the surrounding environment, e.g. obstacles or people in the trajectory of AMRs. Base stations transmit the sensing signals and receive the reflected signals to obtain sensing information, then report the real-time 3GPP sensing data to the core network. The core network can process and analyse the 3GPP sensing data to output the sensing result. Such a sensing result can be exposed to a trusted third-party, e.g. the automation platform of the factory, to enable AMRs to know more about the surrounding environment and improve efficiency and driving safety.
In addition, when there are obstacles (e.g., large factory equipment) that block the transmission of radio signals, or the AMR trajectory crosses indoor and outdoor areas, multiple base stations with sensing capability can work together to improve the sensing accuracy and sensing service continuity.
5.23.2 Pre-Conditions
5G network operator 'MM' provides a 5G sensing service in the factory of Company A. Its 5G system has been deployed covering the factory to provide a continuous sensing service indoors and outdoors.
Company A has placed two AMRs (AMR1 and AMR2) in its factory for moving goods from workshop A to workshop B. At the same time, there are people moving around in both workshops, and other goods or tools may be temporarily placed on the route of the AMRs. The people walking in the workshop and the goods may block the AMR route, jeopardizing production safety. In addition, the two AMRs may collide, considering their flexible routes.
The AMRs of Company A use the '5G Sensing Service' provided by 5G network operator 'MM' while they are working. In order to ensure data security, the related sensing data is not permitted to be delivered outside Company A.
5.23.3 Service Flows
Figure 5.23.3-1 shows the route change of AMR#1 in the process of carrying goods.
Figure 5.23.3-1: Sensing people or obstacle detection in a smart factory
1. AMR#1 is delivering the car parts from workshop A to Sam, who is near the assembly line in workshop B.
2. 3GPP sensing data is collected by Base station #1/RAN; the 5G network processes the 3GPP sensing data to obtain sensing results and detects the proximity of obstacles along the trajectory of AMR#1. The 5G system provides the sensing result to AMR#1, and AMR#1 then re-routes and bypasses the obstacles based on the sensing result.
3. AMR#1 leaves workshop A and crosses from indoor to outdoor.
4. 3GPP sensing data is collected by Base station #2/RAN; the 5G network processes the 3GPP sensing data to obtain sensing results and detects the proximity of Daming, who is walking across the trajectory of AMR#1. The 5G system provides the sensing result to AMR#1, and AMR#1 stops to wait for Daming to leave.
5. AMR#1 enters workshop B.
6. 3GPP sensing data is collected by Base station #3/RAN; the 5G network processes the 3GPP sensing data to obtain sensing results and detects AMR#2 near AMR#1. The 5G system provides the sensing result to AMR#1, and AMR#1 re-routes and bypasses the obstacle based on the sensing result.
7. When AMR#1 enters the coverage of Base station #4, using the 3GPP sensing data from Base station #4/RAN, the 5G network processes the data to obtain sensing results and detects the proximity of John, who is behind a large machine. The 5G system provides the sensing result to AMR#1, and AMR#1 stops and waits for John to leave.
AMR#1 successfully delivers the car parts to Sam in workshop B.
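The per-step reactions in the flow above (stop for people, re-route around static obstacles or another AMR) can be sketched as a simple decision rule. The object-type labels are hypothetical stand-ins for the content of the sensing result.

```python
# Sketch of the AMR's reaction to a sensing result exposed by the
# 5G system. "person", "obstacle" and "amr" are assumed labels.
def amr_reaction(sensing_result):
    kind = sensing_result["object_type"]
    if kind == "person":
        return "stop_and_wait"   # steps 4 and 7: wait for the person to leave
    if kind in ("obstacle", "amr"):
        return "reroute"         # steps 2 and 6: bypass the obstacle/other AMR
    return "continue"            # nothing in the trajectory

print(amr_reaction({"object_type": "person"}))  # stop_and_wait
print(amr_reaction({"object_type": "amr"}))     # reroute
```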
5.23.4 Post-Conditions
Based on the communication and sensing services provided by the 5G network, the AMRs in the factory operate normally and safety incidents are reduced.
5.23.5 Existing features partly or fully covering the use case functionality
None.
5.23.6 Potential New Requirements needed to support the use case
[PR 5.23.6-1] The 5G system shall be able to provide the continuity of sensing service for a specific target object, across indoor and outdoor.
[PR 5.23.6-2] The 5G system shall be able to provide a secure mechanism to ensure sensing result data privacy within the sensing service area.
[PR 5.23.6-3] The 5G system shall be able to support the following sensing related KPIs:
Table 5.23.6-1: Performance requirements of sensing results for AMR collision avoidance in smart factories

| Parameter | Value |
|---|---|
| Scenario | AMR collision avoidance in smart factories |
| Sensing service area | Indoor/outdoor |
| Confidence level [%] | 99 |
| Accuracy of positioning estimate by sensing, horizontal [m] (for the target confidence level) | ≤1 |
| Accuracy of positioning estimate by sensing, vertical [m] | N/A |
| Accuracy of velocity estimate by sensing, horizontal [m/s] | 1 |
| Accuracy of velocity estimate by sensing, vertical [m/s] | N/A |
| Range resolution [m] | 1 |
| Velocity resolution (horizontal/vertical) [m/s x m/s] | 1.5 |
| Max sensing service latency [ms] | <500 |
| Refreshing rate [s] | 0.05 |
| Missed detection [%] | N/A |
| False alarm [%] | 5 |

NOTE 1: The terms in Table 5.23.6-1 are found in Section 3.1.
NOTE 2: The KPI values are sourced from [47].
5.24 Use case on roaming for sensing service of sports monitoring
5.24.1 Description
The sports monitoring application describes the case of a human being monitored while doing exercise by utilizing wireless signals instead of cameras or wearable devices. With enhanced privacy preservation, wireless signals propagated in the 5G system (e.g. between 5G UE and 5G UE, or between the radio access network and a device) can be further reused and processed to retrieve the target sensing object's characteristics [48]. In the sports monitoring situation, the target object is a human and the target object's characteristic is the human body gesture. By comparing the detected body gesture with the correct body gesture for exercises like sit-ups and push-ups, the sports monitoring application gives feedback; for example, it can count the number of repetitions of an exercise, e.g. sit-ups, and calculate calories.
5.24.2 Pre-conditions
The sports monitoring application provider has a service agreement with mobile operator A in country X and mobile operator B in country Y.
Mobile operator A in country X and mobile operator B in country Y have a roaming agreement.
Bob installs a sports monitoring application on his mobile phone and subscribes to the sensing service with mobile operator A.
5.24.3 Service Flows
1. Bob is a sports fan and does exercise every day. He travels to country Y and stays in hotel M, where mobile operator B's sensing service is available. Bob starts the application and selects his favourite exercise, sit-ups.
2. The sensing request is received by mobile operator B, which then determines whether the sensing request can be satisfied and allowed, e.g. by verifying the sensing system availability (infrastructure, sensing modalities), the location of the sensing service, local privacy restrictions, the roaming agreement with Bob's home mobile operator A in country X, and the identity of Bob.
3. If the authorization succeeds, mobile operator B authorizes and provides the 5G sensing service to Bob with the required performance targets and requirements, and Bob pays mobile operator B for the sensing service.
• Mobile operator B executes the sensing measurement process (e.g. micro-Doppler shift) with the sensing entities (e.g. RAN entities and the roaming UE, or a CPE and the roaming UE) to obtain 3GPP sensing data;
• Mobile operator B processes the 3GPP sensing data to derive the sensing result and exposes it to the application server.
4. If the authorization fails, mobile operator B does not provide the 5G sensing service to Bob, and Bob or the sports monitoring application server potentially receives the reason why mobile operator B cannot provide the sensing service.
5. Later, Bob leaves the hotel to run in the area nearby. When Bob runs near a restricted area where the sensing service is not allowed, mobile operator B revokes the authorization and terminates the sensing service. Bob or the sports monitoring application server potentially receives the reason why mobile operator B can no longer provide the sensing service.
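The authorization decision in step 2 (and its revocation in step 5) can be sketched as below. Every input — the restricted-area list, the roaming-agreement registry and the request fields — is a hypothetical stand-in for the checks the flow lists.

```python
# Hypothetical state of the visited operator B.
RESTRICTED_AREAS = {"government-district"}
ROAMING_AGREEMENTS = {("operator-a", "operator-b")}

def authorize_sensing(request):
    """Return (authorized, reason) for a roaming sensing request,
    mirroring the checks listed in step 2 of the service flow."""
    if request["location"] in RESTRICTED_AREAS:
        return (False, "sensing not allowed at this location")
    if (request["home_operator"], request["visited_operator"]) not in ROAMING_AGREEMENTS:
        return (False, "no roaming agreement")
    if not request["sensing_available"]:
        return (False, "no sensing infrastructure at this location")
    return (True, "authorized")

ok, reason = authorize_sensing({"location": "hotel-m",
                                "home_operator": "operator-a",
                                "visited_operator": "operator-b",
                                "sensing_available": True})
print(ok, reason)  # True authorized
```

Re-running the check as the UE moves (e.g. with `location` set to a restricted area) models the revocation in step 5.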
5.24.4 Post-conditions
Bob can enjoy the sports monitoring application even when he travels in another country.
5.24.5 Existing features partly or fully covering the use case functionality
Roaming-related authentication and charging may refer to existing requirements as defined in clause 9 of TS 22.261:
The following set of requirements complement the requirements listed in 3GPP TS 22.115. The requirements apply for both home and roaming cases.
The 5G core network shall support collection of charging information for alternative authentication mechanisms.
The 5G system shall be able to generate charging information regarding the used radio resources, e.g. used frequency bands.
5.24.6 Potential New Requirements needed to support the use case
[PR 5.24.6-1] Based on operator policy, the 5G system shall be able to provide 5G wireless sensing services in the case of roaming.
[PR 5.24.6-2] The 5G network shall provide means for the mobile operator to provide/revoke authorization for the operation(s) of a 5G wireless sensing service based on location, time, specific KPI level and the origin of the request.
[PR.5.24.6-3] The 5G system shall be able to provide 5G wireless sensing service with the following KPIs:
Table 5.24.6-1: Performance requirements of sensing results for sports monitoring

| Parameter | Value |
|---|---|
| Scenario | Sports monitoring |
| Sensing service area | Indoor (living room) |
| Confidence level [%] | 95 |
| Human motion rate accuracy [Hz] | 0.05 (NOTE 2); 0.07 (NOTE 3) |
| Accuracy of positioning estimate by sensing, horizontal [m] | N/A |
| Accuracy of positioning estimate by sensing, vertical [m] | N/A |
| Accuracy of velocity estimate by sensing, horizontal [m/s] | N/A |
| Accuracy of velocity estimate by sensing, vertical [m/s] | N/A |
| Range resolution [m] | N/A |
| Velocity resolution (horizontal/vertical) [m/s x m/s] | N/A |
| Max sensing service latency | 60 s |
| Refreshing rate | 1 min |
| Missed detection [%] | N/A |
| False alarm [%] | N/A |

NOTE 1: The terms in Table 5.24.6-1 are found in Section 3.1.
NOTE 2: Sit-up rate = 30 times/min as reference; 0.05 Hz corresponds to 3 times/min.
NOTE 3: Push-up rate = 40 times/min as reference; 0.07 Hz corresponds to 4 times/min.
5.25 Use Case on immersive experience based on sensing
5.25.1 Description
Sensing based on 5G signals is a technology that uses the differences between wireless signals and their reflected signals, including the Doppler frequency shift, time of flight (ToF), amplitude variation and so on, to sense the surroundings. The information collected during sensing cannot be directly understood by human beings, providing good privacy protection. As 5G steps into the home, more interesting functions can be introduced based on sensing using 5G signals.
It would be fantastic to have an immersive audio and light experience when watching movies and listening to music at home. The speakers can provide this kind of audio experience if they know the position of each other and of the user. Usually, several speakers are needed for an immersive audio experience. Ranging technology can help the speakers determine their positions relative to each other, which gives the speakers a chance to provide a fantastic experience together by adjusting the audio field at the place where the listener stays.
Different from the ranging service, where only a UE's position can be identified, sensing can identify the relative position of a reflector even when the reflector is not a UE. If the speakers can obtain the sensing results, they have a chance to follow the listener and provide an immersive audio experience even when the listener is moving around. The speakers can follow the position of the human and adjust the audio field anytime, anywhere. Similarly to the audio case, if the smart lights can obtain the sensing results, the lights can also track the user anytime, anywhere to provide an immersive experience.
For the immersive experience scenario, the audio field and light adjustment should be based on the user's position. For example, the smart screen, lights and speakers can provide an immersive sound and light experience for the user who sits in the sofa area (around 2-3 m²) and at the same time lower the sound volume and turn down the light in other places of the home, avoiding interference with others. The sensing node (e.g., the smart screen, i.e., a UE) can perform sensing operations to track the user's movement using RF signals [29] and provide the information to the speakers and lights for adjustment. Then both the lights and speakers can provide a cosy zone around the user.
In the home, there is usually a mixed deployment of smart screen, lights, and speakers. We take the smart screen as the sensing node as an example. Assume that the smart screen, the smart lights, and the speakers are placed in the drawing room, and the smart screen can sense the objects in the room as shown in Figure 5.25.1-1. The smart screen (i.e., a UE) can receive the reflected sensing signals transmitted by UEs in the room, or by the gNB, to perform the sensing operation on the user in the room.
Figure 5.25.1-1: Example deployment of the smart screen, speakers and lights
The smart screen can track the user's position via the sensing operation. Each user is assumed to occupy an area of 0.5 m x 0.5 m and to move with a speed lower than 2 m/s in the horizontal dimension. The audio field and light can then be adjusted based on the user's position (e.g. an area of 1.5 m x 1.5 m) to avoid experience deterioration even when the user is walking around.
To identify multiple users in the room, in the horizontal dimension, the resolution needs to distinguish objects as large as 0.5 m. With higher distance accuracy, each user in the room can be identified with a more accurate location, such as 0.2 m, and the light and audio can target the user better [30]. If the results are accurate but out of date due to the movement of the user, this also degrades the experience, as the cosy zone lags behind the movement of the user. Considering that the audio field and lights are adjusted over an area of 1.5 m x 1.5 m, to avoid experience deterioration the service latency should be smaller than 0.5 m / 2 m/s, i.e., 250 ms. To track the user's movement, the results need to be refreshed every 250 ms, i.e., 4 times per second (4 Hz).
Figure 5.25.1-2: Example of accuracy required for the use case
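The 250 ms latency and 4 Hz refresh rate derived above follow directly from the zone size, the user footprint and the walking speed; a quick numerical check:

```python
# Check of the figures derived above: the user occupies 0.5 m x 0.5 m
# inside a 1.5 m x 1.5 m comfort zone, leaving a 0.5 m margin on each
# side, and moves at up to 2 m/s horizontally.
zone_side = 1.5   # m, adjusted audio/light field
user_side = 0.5   # m, area occupied by the user
speed = 2.0       # m/s, maximum horizontal speed

margin = (zone_side - user_side) / 2   # 0.5 m of headroom on each side
latency_s = margin / speed             # time before the user exits the zone
refresh_hz = 1 / latency_s

print(f"max service latency: {latency_s * 1000:.0f} ms")  # 250 ms
print(f"refresh rate: {refresh_hz:.0f} Hz")               # 4 Hz
```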
5.25.2 Pre-conditions
This use case is about Tom's home theatre plan. Tom bought a set of speakers and placed them in his home to create an immersive audio experience. Tom also bought a set of smart lights to create an immersive light experience. Together with a smart screen in the home, these smart devices compose the home theatre. According to the instructions for the speakers, and based on the availability of audio and power wiring, Tom distributed the speakers in his living room. The smart lights are installed on the ceiling of the room. After detecting the position of each distributed speaker, the home theatre system can adjust the audio field in Tom's home. After detecting the position of the smart lights, the home theatre system can control the light variation in the home. This detection mechanism of the speakers and smart lights is outside the scope of this document.
There is a sensing device in the home that can sense Tom's position without requiring Tom to carry a UE with him. For example, the sensing node is the smart screen belonging to the home theatre system at Tom's home. Based on the sensing results from the sensing node, the home theatre system can adjust the audio field and light.
The home theatre system can be deployed by the user, where the smart screen, lights and speakers communicate with each other via direct device communication using an unlicensed band. As an alternative, the home theatre system can be deployed with the help of the operator, using a licensed band, when the units of the home theatre system are in coverage. In this case, the operator can control the performing of the sensing operation based on the location of the home theatre deployment.
5.25.3 Service Flows
Tom activates his home theatre and the speakers begin to play music on demand. The sensing device in the room (i.e., the smart screen) performs the sensing operation, and the sensing results begin to help adjust the audio field.
Figure 5.25.3-1: Immersive experience with tracking light and sound
Tom immerses himself in the music and begins to dance. The sensing device (e.g., the smart screen) tracks Tom's position via processing of the received sensing signals; the sensing result is calculated and sent to the control unit of the home theatre system.
Based on the sensing results, the home theatre system can adjust the audio field and lights according to Tom's movement.
Thanks to the sensing service provided by the sensing node (e.g. the smart screen), no matter where Tom stands, he always experiences the best surround sound and tracking light.
5.25.4 Post-conditions
Thanks to the sensing service provided in the intelligent home, Tom can have an immersive experience via his home theatre.
5.25.5 Existing features partly or fully covering the use case functionality
None.
5.25.6 Potential New Requirements needed to support the use case
[PR 5.25.6-1] The 5G system shall be able to configure and authorize sensing for a Sensing device or a group of Sensing devices when using licensed spectrum based on the Sensing device's location.
[PR 5.25.6-2] The 5G system shall be able to enable a Sensing device to perform sensing with licensed band under operator’s control based on the Sensing device’s location.
[PR 5.25.6-3] The 5G system shall be able to enable UEs without 5G coverage to use unlicensed spectrum to perform sensing.
[PR 5.25.6-4] Subject to user consent and national or regional regulation, based on operator policy, the 5G system shall be able to allow a Sensing device to provide sensing results to a trusted third party.
[PR 5.25.6-5] The 5G system shall be able to provide sensing with following KPIs:
Table 5.25.6-1: Performance requirements of sensing results for immersive experience

| Parameter | Value |
|---|---|
| Scenario | Immersive experience |
| Sensing service area | Indoor |
| Confidence level [%] | 95 |
| Accuracy of positioning estimate by sensing, horizontal [m] (for the target confidence level) | 0.5 |
| Accuracy of positioning estimate by sensing, vertical [m] | 0.5 |
| Accuracy of velocity estimate by sensing, horizontal [m/s] | 0.1 |
| Accuracy of velocity estimate by sensing, vertical [m/s] | N/A |
| Range resolution [m] | 0.5 |
| Velocity resolution (horizontal/vertical) [m/s x m/s] | N/A |
| Max sensing service latency [ms] | 250 (granularity of the field is 1.5 m x 1.5 m) |
| Refreshing rate [s] | 0.25 |
| Missed detection [%] | 5 |
| False alarm [%] | 5 |

NOTE: The terms in Table 5.25.6-1 are found in Section 3.1.
5.26 Use case on accurate sensing for automotive manoeuvring and navigation service
5.26.1 Description
It is forecast that there will be approximately 8 million autonomous or semi-autonomous vehicles on the road by 2025 [49]. NR wireless sensing will assist with automotive manoeuvring and navigation, especially in scenarios where the information collected by single car-mounted sensors is not enough for making safe and reliable decisions, e.g. to avoid a collision, pedestrians, etc.
This scenario reuses the use case defined in section 5.8, where NR wireless sensing is utilized to assist automotive manoeuvring, i.e. sensing results play an important role in making the manoeuvring decisions. However, the sensing environment when RAN entities and UEs execute the sensing measurement process may be subject to high interference (e.g. interference caused by adjacent RAN entities, radars, or fake base stations), which can cause the collected sensing information to be wrong.
The sensing information provided to the Automated Driving System (ADS) server needs to be fully trustworthy: reliability, integrity, a high confidence level and protection against tampering are key aspects. Users (and third parties) should not be able to defraud the ADS server by tampering with the sensing information in order to influence the manoeuvring decision.
5.26.2 Pre-conditions
Refer to 5.8, where Bob's vehicle is detected to be blocked by another vehicle and cannot make the decision for autonomous driving with the data collected by its sensors (e.g. from lidar, radar, camera, etc.). Bob recognizes the need for 5G system assistance and requests the 5G system to coordinate the sensing service.
5.26.3 Service Flows
1. Bob drives downtown and is approaching a crossroad with regulatory signs, where the sensors on Bob's vehicle are blocked by other vehicles. Bob's vehicle cannot see the surroundings, and Bob sends a sensing request to the 5G system.
2. The 5G system gives Bob's vehicle instructions on how to proceed with the sensing, and Bob's vehicle selects Joe's vehicle to assist the sensing service. Joe's vehicle transfers the sensing contextual information to Bob's ADS server.
3. Bob's ADS server utilizes the sensing contextual information to make the decision, and Bob decelerates before the traffic light.
4. Bob continues the journey and drives into a tunnel (10 km long); the sensors on Bob's vehicle are detected to be blocked by a big truck. Bob sends a sensing request to the 5G system.
5. The 5G system selects RAN entities (e.g. roadside units) to assist the sensing service. The 5G system transfers the sensing information collected by the RAN entities to Bob's ADS server.
6. Bob's ADS server utilizes the sensing information to make the decision, and Bob accelerates to overtake the big truck.
7. Bob continues the journey and drives in a desert area, where some sensors on Bob's vehicle are detected to be broken (e.g. because of heat). Bob's vehicle sends a sensing request to the 5G system.
8. The 5G system selects RAN entities (sparsely deployed in the desert area) to assist the sensing service. With the feedback from the RAN entities, the 5G system transfers the sensing information together with an indication that the confidence level is 60% to the ADS server.
9. Bob's ADS server utilizes the indication and the sensing information, and decides to return to Level 0 driving for safety.
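Steps 8-9 can be sketched as a threshold rule on the reported confidence level. The 90% policy threshold is an assumption for illustration; only the 60% figure comes from the flow above.

```python
# Sketch of the ADS server's use of the confidence level reported
# with the sensing result. The 90% threshold is a hypothetical
# policy value, not a 3GPP-defined one.
def driving_decision(confidence_percent, threshold=90):
    if confidence_percent >= threshold:
        return "use sensing result for manoeuvring"
    return "fall back to Level 0 driving"

print(driving_decision(60))  # step 9: return control to the driver
```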
5.26.4 Post-conditions
Bob's vehicle is able to drive with high reliability by utilizing the accurate 5G sensing service.
5.26.5 Existing features partly or fully covering the use case functionality
None.
5.26.6 Potential New Requirements needed to support the use case
[PR 5.26.6-1] The 5G system shall be able to determine the confidence level of the sensing results.
5.27 Use case public safety search and rescue or apprehend
5.27.1 Description
The ability to quickly locate an individual who is either missing (search and rescue) or is a suspect in an illegal activity (apprehend) is very important for public safety. Statistics show that the quicker a missing person can be found, the higher the possibility that they are found in good condition. Similarly, for a suspect in an illegal activity, the quicker they can be located, the less likely they are to hide or commit another illegal activity. These activities can take place in both outdoor and indoor environments.
Leveraging the sensing capability of a 3GPP network and integrating the feature with other 5G capabilities (e.g., metaverse, augmented reality, location, network relay, etc.) can provide a huge advantage to public safety.
In an outdoor example, an elderly person with Alzheimer's disease wanders off into the woods and does not know how to return. The longer it takes for public safety to locate this person, the more likely they are to suffer injuries or medical issues, such as dehydration, cuts, bruises, broken bones, or worse, death. Another example is an individual who robs a bank and escapes into the nearby forest and swamps. The longer it takes to track down and find the individual, the more difficult it becomes and the bigger the risk of them taking hostages, hurting others, or escaping completely.
With the density of base station deployments and the large numbers of 5G-enabled UEs, 5G coverage includes a large amount of the territory of most countries. These base station signals, and the signals of UEs, can also be used to sense the environment for object detection.
Assumptions for this use case:
• Trusted third-party applications for interpreting and presenting the data to public safety are required;
• Devices must be trusted and communicate information together; and
• Precision 3-axis location is needed.
An indoor example would be a firefighter entering a building with limited or no visibility. Using sensing integrated with firefighter heads-up displays/UEs (metaverse/AI/ML), sensing can better allow the firefighters to locate people possibly trapped inside and give them a better view of the rooms as they work through the building.
Indoor 5G coverage can be challenging, but leveraging features like UE-to-UE relay, UE-to-Network relay, UE-to-Network multi-hop relay and UE-to-UE multi-hop relay, and allowing the devices to work together, can provide good coverage and the capability to provide indoor sensing services in this challenging environment.
Assumptions for this use case:
• The use of UE-to-Network relay and UE-to-UE relay can provide more reliable connectivity;
• Devices must be trusted and communicate information together; and
• Precision 3-axis location is needed.
5.27.2 Pre-conditions
1) Operator A's network supports sensing capability with its base stations and has 3GPP sensing enabled UEs on its network.
2) Local public safety officials have a relationship with Operator A allowing them to access the network's sensing capability and service.
3) Appropriate security and privacy requirements are in place between the operator and the public safety organization.
5.27.3 Service Flows
1) Public safety is notified of a need to search for an individual. This could be either a search-and-rescue or an apprehend scenario, and could involve both indoor and outdoor environments.
2) Operator A's network is 3GPP sensing enabled, and public safety's UEs are 3GPP sensing enabled.
3) Public safety personnel begin searching for the individual using traditional methods, UAVs and UEs with 3GPP sensing capabilities.
4) Depending on coverage, there may be a need to leverage indirect network connections.
5) Using the 3GPP sensing data and non-3GPP sensing data, public safety can quickly locate the person of interest.
6) The individual is located using the combination of capabilities.
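The combination of 3GPP and non-3GPP sensing data in step 5 can be sketched with an inverse-variance weighted average, a standard fusion rule for independent position estimates; the positions and variances below are invented for illustration.

```python
# Fuse several independent estimates of one position coordinate,
# weighting each by the inverse of its reported variance.
def fuse(estimates):
    """estimates: list of (position, variance) pairs for one axis."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    return sum(p * w for (p, _), w in zip(estimates, weights)) / total

# Hypothetical inputs: 3GPP sensing says x = 10.0 m (variance 1.0);
# a body camera (non-3GPP sensing data) says x = 12.0 m (variance 4.0).
print(round(fuse([(10.0, 1.0), (12.0, 4.0)]), 2))  # 10.4
```

The more precise 3GPP estimate dominates the fused value, which is the intended behaviour of this weighting.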
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.27.4 Post-conditions | The individual is located faster than without 3GPP sensing capability. The additional harm that might have occurred to the individual or community is avoided. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.27.5 Existing features partly or fully covering the use case functionality | TBD |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.27.6 Potential New Requirements needed to support the use case | [PR.5.27.6-1] The 5G system shall support exposing the information of sensing result (e.g., location, relative location, velocity vectors, relative headings, etc.) to the trusted and secure mission critical applications.
[PR.5.27.6-2] The 5G system shall support mechanisms for combining 3GPP sensing data and non-3GPP sensing data (e.g., body cameras.) depending on location, availability of non-3GPP sensing data, and public safety applications.
[PR.5.27.6-3] The 5G system shall support security and protection of the 3GPP sensing data, non-3GPP sensing data, and sensing results.
[PR.5.27.6-4] The 5G system shall provide a secure sensing service for Mission Critical Services.
[PR.5.27.6-5] The 5G system shall be able to provide the sensing service with the following KPIs:
Table 5.27.6-1 Performance requirements of sensing results for public safety search and rescue or apprehend
Scenario: Search and Rescue/Apprehend
Sensing service area: Outdoor/Indoor
Confidence level [%]: 99
Accuracy of positioning estimate by sensing (for a target confidence level): Horizontal [m]: ≤ 0.5; Vertical [m]: ≤ 1.0
Accuracy of velocity estimate by sensing (for a target confidence level): Horizontal [m/s]: Pedestrian: ≤ 1.5; Vertical [m/s]: Pedestrian: ≤ 1.5
Sensing resolution: Range resolution [m]: 3; Velocity resolution (horizontal/vertical) [m/s x m/s]: Horiz: 5, Vert: 5
Max sensing service latency: ≤ 1 s
Refreshing rate: ≥ 10 Hz
Missed detection [%]: [≤ 3]
False alarm [%]: [≤ 3]
NOTE: The terms in Table 5.27.6-1 are found in Section 3.1.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.28 Use case on Vehicles Sensing for ADAS | |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.28.1 Description | Advanced Driving Assistance System(ADAS) uses various sensors (Wireless Sensing millimeter wave radar, lidar, monocular / binocular camera and satellite navigation) installed on the vehicle to sense the surrounding environment at any time during the driving process, collect data, identify, detect and track static and dynamic objects, and carry out systematic calculation and analysis in combination with navigation map data, so as to make the driver aware of the possible dangers in advance, and effectively increase the comfort and safety of driving.
Figure 5.28.1-1 ADAS overview
There is an opportunity for 3GPP New Radio (NR) based sensing technologies to be added to ADAS; a 5G based wireless sensing service could improve ADAS reliability and quality.
The ADAS has the map information and the real-time location/trajectory of the car, and can assist driving, e.g. stop the car to avoid a collision. The car (as 3GPP UE) is equipped with 3GPP NR based sensing technology. When the UE initially accesses the 5G network, the UE is authorized by the 5G network to participate in sensing under the operator’s control. The 3GPP sensing data comes from the NR based sensor, and the sensing result is sent to the ADAS of the car. Collaborating with other non-3GPP sensing devices/technologies, the NR based sensing result as input to ADAS could improve the comfort and safety of driving.
The vehicle as a 3GPP UE based sensing sensor is important for automotive use cases; it can operate under network control and complements network-based sensing. Network-based sensing alone cannot fully address the automotive use case needs, e.g.:
1. There may be a blockage from the network (base station) to the sensed target.
2. ADAS is concerned more with relative positioning than with absolute position. Network-based sensing introduces additional errors (due to compounding errors of two separate positions).
3. Automotive applications may determine sensing priority/performance requirements locally (not visible to the gNB).
UE based sensing resource allocation can be under the control of the network (e.g. base station). For UEs inside the coverage of a network enhanced for sensing, resources can be directly controlled. Sensing results of the automobiles can be shared via 3GPP connections (Uu or PC5).
It is expected that the 3GPP NR sensing service for ADAS should meet the requirements and level of commercial ADAS radar sensing performance. Based on requirements for existing automotive sensors, the RF based sensing requirements for automotive applications are given in the following table.
Table 5.28.1-1 RF based sensing requirements for automotive applications of existing automotive sensors
(Typical Automotive Radar KPIs)
Parameter | Long Range Radar [10][12][20][50][51][52] | Short Range Radar [10][12][20][52][53][54]
Maximum range | 250-300 m (for RCS of 10 dBsm with >90% detection probability) | 30-100 m
Range resolution | 10-75 cm | 5-20 cm
Range accuracy | ±10 to ±40 cm | ±2 to ±10 cm
FOV azimuth | ±9-15 deg | ±60-85 deg
Azimuth resolution | 1-3 deg | 3-9 deg
Azimuth accuracy | ±0.1-0.3 deg | ±0.3-5 deg
Update rate | 5 to 20 fps | 20 to 50 fps
Max one-way velocity | ±50 m/s to ±70 m/s | ±30 m/s
Velocity resolution | 0.1-0.6 m/s | 0.1-0.6 m/s
Velocity accuracy | ±0.03 to ±0.12 m/s | ±0.03 to ±0.12 m/s
Although all vehicles on the road are expected to be equipped with NR radios, they will not necessarily utilize them 100 percent of the time. Enabling NR radio based sensing for ADAS can be a better utilization of this capability and of radio resources.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.28.2 Pre-conditions | The 3GPP UE in the car has 3GPP subscription and is authorized by the operator to perform sensing. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.28.3 Service Flows | Figure 5.28.3-1 ADAS
1. Tom buys a new car with the latest ADAS equipped.
2. Tom wants to drive the car from home to the company on the morning of a working day. Tom drives from home onto the road. The 3GPP NR based sensor in Tom’s car transmits the 3GPP NR signal towards the other car(s) on the same road, and receives the reflected signal to detect the distance and speed of the other car(s) to feed to the ADAS in Tom’s car.
3. While the car is driving on the highway, a car suddenly stops in front of Tom’s car due to an accident; fortunately it is detected in time by the NR based sensor.
4. The NR based sensor sends a collision warning to the ADAS, and the ADAS stops the car immediately.
5. Finally, Tom’s car avoids a collision and leaves the highway safely.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.28.4 Post-conditions | With the safely driving experience provided by ADAS, Tom arrives in the company safely and easily. Tom starts the daily work in the office. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.28.5 Existing features partly or fully covering the use case functionality | There are features of Sidelink positioning for the car moving along the LOS road (e.g., using Sidelink positioning for car ranging on the same road), which requires the participant cars are 3GPP UEs. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.28.6 Potential New Requirements needed to support the use case | [PR 5.28.6-1] The 5G system shall be able to configure and authorize UEs supporting V2X applications to perform sensing.
[PR 5.28.6-2] The 5G system shall be able to collect charging information for UEs supporting V2X applications when performing sensing.
[PR 5.28.6-3] The 5G system shall be able to support the following KPIs:
Table 5.28.6-1 KPIs for Vehicles Sensing for ADAS
ADAS [long range Radar]:
- Sensing service area: Outdoor
- Confidence level [%]: [95]
- Accuracy of positioning estimate by sensing (for a target confidence level): Horizontal [m]: [≤ 1.3] (NOTE 2); Vertical [m]: ≤ 0.5
- Accuracy of velocity estimate by sensing (for a target confidence level): Horizontal [m/s]: [≤ 0.12] (NOTE 4); Vertical [m/s]: N/A
- Sensing resolution: Range resolution [m]: [0.4] (NOTE 5); Velocity resolution (horizontal/vertical) [m/s x m/s]: [≤ 0.6] (NOTE 4)
- Max sensing service latency [ms]: [50]
- Refreshing rate [s]: [≤ 0.2]
- Missed detection [%]: [≤ 10]
- False alarm [%]: [<1]
ADAS [Short range Radar]:
- Sensing service area: Indoor (parking space)/Outdoor
- Confidence level [%]: [95]
- Accuracy of positioning estimate by sensing (for a target confidence level): Horizontal [m]: [≤ 2.6] (NOTE 3); Vertical [m]: ≤ 0.5
- Accuracy of velocity estimate by sensing (for a target confidence level): Horizontal [m/s]: [≤ 0.12] (NOTE 4); Vertical [m/s]: N/A
- Sensing resolution: Range resolution [m]: [0.4] (NOTE 5); Velocity resolution (horizontal/vertical) [m/s x m/s]: [≤ 0.6] (NOTE 4)
- Max sensing service latency [ms]: [20]
- Refreshing rate [s]: [≤ 0.05]
- Missed detection [%]: [≤ 10]
- False alarm [%]: [<1]
NOTE 1: The terms in Table 5.28.6-1 are found in Section 3.1.
NOTE 2: Assuming a typical max range of 250 m, range accuracy of 10 cm, azimuth accuracy of ±0.3 deg. Positioning accuracy as (Min-Max) within field of view.
NOTE 3: Assuming a typical max range of 30 m, range accuracy of 2 cm, azimuth accuracy of ±5 deg. Positioning accuracy as (Min-Max) within field of view.
NOTE 4: Velocity accuracy and resolution are typically reported for the radial velocity (not absolute H/V velocity).
NOTE 5: Range resolution is typically reported as 3D ranging distance accuracy.
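The bracketed horizontal positioning-accuracy figures in NOTEs 2 and 3 can be reproduced, to a first approximation, from the quoted radar parameters: at maximum range, the cross-range error implied by the azimuth accuracy dominates the quoted range accuracy. The exact derivation is not spelled out in this report, so treating the cross-range error as range × tan(azimuth accuracy) is an assumption in this sketch:

```python
import math

# Approximate horizontal positioning error of an automotive radar at max range:
# cross-range error ≈ max_range * tan(azimuth accuracy), which dominates the
# quoted range accuracy (10 cm / 2 cm) at these distances.
def cross_range_error_m(max_range_m: float, az_accuracy_deg: float) -> float:
    return max_range_m * math.tan(math.radians(az_accuracy_deg))

long_range = cross_range_error_m(250, 0.3)   # long range radar figures (NOTE 2)
short_range = cross_range_error_m(30, 5.0)   # short range radar figures (NOTE 3)
print(round(long_range, 1), round(short_range, 1))  # 1.3 2.6, as in Table 5.28.6-1
```

The rounded results match the [≤ 1.3] and [≤ 2.6] horizontal accuracy entries of Table 5.28.6-1.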
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.29 Use case on Coarse Gesture Recognition for Application Navigation and Immersive Interaction | |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.29.1 Description | As a new way of human-device interface, gesture recognition enables a more intuitive interaction between humans and machines, compared to the conventional text or GUI-based interfaces. Common applications of gesture recognition include touchless control of mobile devices, such as smartphones, laptops, and smart watches. Compared to other use cases, such as sports monitoring or sleep monitoring, gesture recognition requires higher resolution, higher update rate, and lower latency, which makes it more challenging in terms of resource utilization and processing complexity.
Gesture recognition identifies motions and postures of human body parts, such as head, hands, and fingers. As shown in Figure 5.29.1-1, gesture recognition can be applied to various applications such as human motion recognition, keystroke detection, sign language recognition and touchless control.
Gesture/motion/posture recognition
• Human motion recognition
• Keystroke detection
• Touchless control
• Sign language recognition
Figure 5.29.1-1 Gesture Recognition
In this use case, we focus on the application of gesture recognition for touchless control and immersive (i.e. XR) applications.
For touchless control, the identified gestures are interpreted as specific behaviours or operations of the device, including locking/unlocking a screen, increasing/decreasing volume, and navigating forward/backward through web pages.
For XR applications, position tracking and mapping of the human body is a basic requirement that permeates XR applications to provide the immersive experience [55]. A variety of sensors are integrated into XR devices to measure the movement of the human body (e.g., head, eyes, hands, arms) in order to simulate normal human mimicry. Beyond such simulation, hand tracking enables a natural and intuitive way of interaction between the human and the machine compared with the use of a physical controller. Gesture recognition and hand tracking become vital functions in XR applications, especially when the human and the controlled object are located in the physical and virtual worlds, respectively.
For both touchless control and XR application, NR-based RF sensing is suitable for gesture recognition because certain RF signals can detect coarse body movements and the RF signals are not susceptible to the ambient illumination condition and occlusions in the environment. In addition, NR-based RF sensing allows for coarse hand tracking in a lower-complexity and economic manner. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.29.2 Pre-conditions | There are two roommates, Jose and Bob, both of whom subscribed to MNO A, which has deployed RAN entity (e.g., an indoor base station) supporting NR-based sensing.
Jose subscribes to the touchless user interface service and his mobile device has NR sensing capability.
Bob subscribes to the immersive interaction service, provided by both MNO A and the XR application (e.g. game, sports training) provider AppX with access rights to the interaction information relevant to Bob’s hands. Bob’s UEs (e.g. smartphone, XR device such as a headset) are capable of NR-based sensing, and Bob’s XR device also has other sensors (e.g. IMU) embedded on the device.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.29.3 Service Flows | Social Media Navigation service with Gesture Recognition
Step 1: Sitting in his room, Jose is reading through social media posts using his smartphone. To navigate to the previous or next post, Jose waves his hand in the air from left to right or from right to left.
Step 2: Jose’s smartphone detects the hand gesture via 5G wireless sensing, using the RAN entity, the UE or both. The smartphone and RAN entity can send sensing data (with extracted gesture features such as range and Doppler of the detected gesture) to the 5G network.
Step 2b: Alternatively, Jose’s smartphone detects the hand gesture using sensing signals and processes those signals to generate 3GPP sensing data and sensing results. The 3GPP sensing data can also be further combined and processed with non-3GPP sensing data at the UE to generate a combined sensing result.
Step 3: The 5G network then aggregates and processes the information collected from the UE and RAN entity to detect Jose’s gesture and provides the sensing results to Jose’s smartphone, where they are shared with the social media application and used for navigating the posts.
Immersive interaction service with Gesture Recognition
Step 1: Bob launches the XR application in his room, at which moment one avatar is generated to represent him in the virtual world of the XR application, and the immersive interaction service is activated as shown in Fig 5.29.3-1.
Step 2: When Bob sees a basketball flying toward him, he catches the ball and throws it back. The characteristics of the gesture (e.g. range, Doppler shift) are detected using NR sensing signals of the UEs, the RAN entity or both.
Step 3: The 5G network (e.g. 5GC) collects the 3GPP sensing data from the UEs, the RAN entity or both, processes the data and exposes the sensing result (e.g. 3D position, velocity) to the XR application. In parallel, the non-3GPP sensing data obtained from the XR device is transmitted transparently to the application platform through the UE and 5GC.
Step 3b: Alternatively, the 5G network can combine and process the 3GPP sensing data and non-3GPP sensing data obtained from the XR device for the same posture, and then expose the combined sensing result (e.g. 3D position, velocity) with the contextual information (e.g. time) to the XR application platform.
Step 4: The gesture and hand movement are recognized and, as a sensing result, the basketball bounces in another direction. The entire course is presented on Bob’s headset as the basketball being caught by one hand of Bob’s avatar and thrown back.
Figure 5.29.3-1 Hand Tracking in XR applications |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.29.4 Post-conditions | Due to the RF sensing capability in Jose’s mobile device and a nearby RAN entity, Jose’s gestures are detected and used to navigate the social media posts on his phone.
Similarly, due to the RF sensing capability in Bob’s smartphone, XR device and a nearby RAN entity, Bob’s coarse gestures and the motion of his hands are recognized and tracked correctly. The avatar in the XR application shows the correct gesture and executes the correct action triggered by the gesture.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.29.5 Existing features partly or fully covering the use case functionality | None. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.29.6 Potential New Requirements needed to support the use case | [PR 5.29.6-1] The 5G system shall be able to provide sensing with the following KPIs:
Table 5.29.6-1 Performance requirements of sensing results for gesture recognition
Scenario: Gesture recognition
Sensing service area: Indoor
Confidence level [%]: 95
Motion rate accuracy: N/A
Accuracy of positioning estimate by sensing (for a target confidence level): Horizontal [m]: 0.2 (NOTES 4 and 5); Vertical [m]: 0.2 (NOTES 4 and 5)
Accuracy of velocity estimate by sensing (for a target confidence level): Horizontal [m/s]: 0.1; Vertical [m/s]: 0.1
Sensing resolution: Range resolution [m]: 0.375 (NOTES 1, 2 and 5); Velocity resolution (horizontal/vertical) [m/s x m/s]: 0.3
Max sensing service latency [ms]: 5 – 50 (NOTE 3)
Refreshing rate [s]: ≤ 0.1
Missed detection [%]: ≤ 5
False alarm [%]: ≤ 5
NOTE 1: Due to the resolution allowed by the KPIs above, the use of pre-defined gestures for determining a user’s basic hand gestures is assumed.
NOTE 2: The range resolution is determined using the formula C/(2*B), where C is the speed of light and B is the bandwidth (assuming B = 400 MHz, which is supported in 5G mmWave networks).
NOTE 3: The value is derived from TS 22.261 [33] clause 7.11 on the maximum allowed end-to-end latency for immersive multi-modal KPIs.
NOTE 4: Positioning accuracy KPIs are based on the minimum supported positioning accuracy in 5G systems [TS 22.261].
NOTE 5: By combining non-3GPP sensing data (if available) with 3GPP sensing data, some of these KPIs may be improved.
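As a check of NOTE 2, the 0.375 m range resolution in Table 5.29.6-1 follows directly from the C/(2*B) formula with the assumed 400 MHz bandwidth:

```python
# Range resolution of an RF sensing system: d_res = C / (2 * B)   (NOTE 2)
C = 3.0e8        # speed of light [m/s]
B = 400e6        # assumed bandwidth [Hz], supported in 5G mmWave networks
d_res = C / (2 * B)
print(d_res)     # 0.375 m, the value quoted in Table 5.29.6-1
```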
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.30 Use case on sensing for automotive manoeuvring and navigation service when not served by RAN | |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.30.1 Description | Consider the scenario defined in section 5.8, where NR wireless sensing is utilized to assist automotive manoeuvring, i.e. sensing results play an important role in making the manoeuvring decisions. However, in this section the vehicles are not served by RAN when the Sensing activity is expected to occur, where UE is not served by RAN and therefore RAN entities are unable to be involved. UEs performing the sensing measurement process in this scenario have to be able to operate when UE is not served by RAN.
5.30.2 Pre-conditions
Refer to 5.8, where Bob’s vehicle determines the need for a sensing service. Bob’s vehicle integrates a UE supporting V2X applications, i.e., Bob’s vehicle supports V2X. Sensing can be performed by 5G Wireless sensing. Unlike in section 5.8, the vehicles are not served by RAN and perform 5G Wireless sensing without the help of network entities. Bob’s vehicle has been provisioned by his home operator to be able to perform 5G Wireless sensing when not served by RAN. There may be other vehicles or RSUs (road side units) helping the 5G Wireless sensing of Bob’s vehicle.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.30.3 Service Flows | 1. Bob is driving from urban to rural countryside. As his vehicle is operational, in motion, and attempting to assist his driving using ADS, Bob’s vehicle is performing sensing using 5G Wireless sensing.
2. Bob drives his vehicle outside the coverage area of its mobile network. The 5G Wireless sensing continues providing assistance to Bob’s driving.
3. Bob’s vehicle approaches Joe’s vehicle.
4. Bob’s vehicle becomes aware of Joe’s vehicle by Bob’s onboard Sensing receiver(s) receiving 5G Wireless sensing reflected by Joe’s vehicle.
5. Bob’s vehicle applies this new object information to the local ADS function to ensure a safe, collision-free, drive. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.30.4 Post-conditions | Bob’s vehicle is able to drive with high reliability by utilizing 5G sensing service when not served by RAN. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.30.5 Existing features partly or fully covering the use case functionality | None. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.30.6 Potential New Requirements needed to support the use case | [PR 5.30.6-1] The 5G system shall be able to provide mechanisms for an MNO to configure UEs supporting V2X application for 5G Wireless sensing operation when not served by RAN.
[PR 5.30.6-2] Subject to regulation, the 5G system shall enable UEs supporting V2X application to perform 5G Wireless sensing when not served by RAN using the allowed ITS spectrum and unlicensed spectrum. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.31 Use case on blind spot detection | |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.31.1 Description | Blind spot detection reduces the risk of accidents during lane changes by monitoring the dangerous blind spot area [26]. The blind spot area is a typically a moving target area that changes when car moves if we take the road infrastructure as reference point. Currently, the blind spot detection system operates via a variety of external sensors located on a car’s bumpers and wing mirrors, which can detect if a person or vehicle enters your blind spot, notifying you via an audible or visual cue - typically, a warning light located in the car’s wing mirror.
Figure 5.31.1-1: blind spot detection system, a moving target area.
Wireless sensing technology can be utilized to detect obstacles present in a car’s blind spot area:
• Case I: Base stations on the roadside are already used to provide 5G coverage for communication, and the radio signals that are reflected can be used to sense the blind spot area of a car.
• Case II: Base stations on the roadside are already used to provide 5G coverage for communication, and the radio signals that are received by the UE (i.e. the car is a 3GPP UE, or there is a 3GPP UE such as a smartphone in the car) can be used to sense the blind spot area of the car.
Any obstacle present in the car’s blind spot area, whether moving (e.g. car, motorcycle, walking human, animal) or static (building, tree), will affect the reflected/received signals. By deriving the characteristics of the affected signals, the obstacle can be detected and danger can be avoided. It is convenient to utilize the currently deployed 5G network system to achieve this blind spot detection.
This use case reuses 5.8 to describe the blind spot detection aspects.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.31.2 Pre-conditions | MNO provides blind spot detection sensing service to different kinds of subscribers:
- Bob’s car is a 5G UE and subscribes to this sensing service. His car has NR-based sensing technology, and its capabilities (e.g. NR-based sensing capabilities, sensing processing capabilities) are also provided to the MNO.
- Juan’s car is not a 5G UE, but his 5G UE (smartphone) subscribes to this sensing service, where an application is installed on the 5G UE.
- Alex’s car is not a 5G UE, an application installed on his car subscribes to this sensing service.
Laura has no subscription to the blind spot detection sensing service. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.31.3 Service Flows | Step 1: Bob, Juan, Alex and Laura are friends and driving together to Alps skiing resort. At 8:00am, Bob starts from Street A, Juan and Alex start from Street B and Laura start cars from Street C. Bob, Juan and Alex trigger the blind spot detection sensing service separately.
Step 2: When received the service request, 5G system discovers and configures sensing transmitter(s) and sensing receiver(s) to track and monitor the moving car’s blind spot area (from sensing transmitter’s perspective), e.g.:
• Bob’s car is a 5G UE and has NR-based sensing capabilities; the 5G system configures base station(s) as sensing transmitter and Bob’s car as the sensing receiver. The moving blind spot area is tracked by Bob’s 5G UE.
• Juan’s car and Alex’s car are not 5G UEs; the 5G system configures base station(s) as both sensing transmitter and sensing receiver. Juan’s car’s moving blind spot area and Alex’s car’s moving blind spot area are separately tracked by the 5G base station.
Step 3: 3GPP sensing data is collected by the sensing receiver and transferred to the network sensing processing entity to derive the sensing result, which is then exposed to the service consumer to find out whether there is an obstacle present in the blind spot area, e.g.
• Bob’s car is a 5G V2X UE and has processing capabilities; the 5G system authorizes the 5G UE as the sensing processing entity.
• Juan’s car is not a 5G V2X UE, but Juan has his smartphone (5G UE) carried in the car; the 5G system authorizes the 5G UE as the sensing processing entity.
• Alex has no 5G UE on board; the 5G network processes the 3GPP sensing data to derive the sensing result.
Step 4: A car is moving very fast from Street A to Street B to Street C and appears sequentially in Bob’s, Juan’s, Alex’s and Laura’s blind spot areas.
• Bob is changing lane; the moving car is detected and Bob safely changes lane.
• Juan and Alex are turning; the moving car is detected and they safely complete their turns.
• Laura is overtaking a truck when the moving car suddenly appears in her blind spot area; Laura forgets the over-shoulder check and hits the moving car.
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.31.4 Post-conditions | Bob, Juan and Alex drive safely to the Alps skiing resort and enjoy their holiday thanks to the blind spot detection sensing service.
Laura is in hospital. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.31.5 Existing features partly or fully covering the use case functionality | None. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.31.6 Potential New Requirements needed to support the use case | [PR 5.31.6-1] The 5G System shall be able to provide sensing service to track a moving target sensing service area. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.32 Use case of integrated sensing and positioning in factory hall | |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.32.1 Description | Autonomous Mobile Robots (AMR)s and automated guided vehicle (AGV) are enabling solutions for a smart factory environment, in which a diversity of logistic tasks are done with an autonomous and efficient implementation, with minimal direct human engagement. In order to achieve a safe and efficient operation to serve a desired goal (e.g., transfer of construction materials with minimal risk, delay and energy consumption), a command center may take over the task of collecting the information of the involved facilities and performing a coordinated planning of the device operations. Nevertheless, a safe and efficient operation of the mobile devices may be only achieved on the condition of an accurate awareness of the environment (e.g., obstacles, humans) and the device positioning/tracking information (e.g., AGV/AMR position and velocity).
In view of the above, the 5G system shall serve the implementation of a smart factory by means of a reliable data connectivity (e.g., the low-latency and reliable AGV/AMR to command center connection) and positioning of the involved AGV/AMR UE devices. Furthermore, the 5G system sensing services can be utilized to augment the environment awareness by means of detecting and locating the non-connected objects (e.g., obstacles such as trash box, or safety-sensitive objects such as human, etc.).
In addition to the detection of the non-connected objects, the 5G system sensing enables a higher positioning and tracking accuracy of a target UE at the 5G system, by means of augmenting the positioning and sensing capabilities [27]. In case of an AMR/AGV, the positioning measurement of a device can be augmented at the 5G system with the 3GPP sensing data obtained from the reflections of the sensing signal from the AMR/AGV physical body, in the interest of a higher environment awareness and positioning accuracy, as well as the additional information obtained via sensing of the AMR/AGVs (e.g., orientation of an AMR). In particular, when both 5G system sensing and positioning services are activated, the same 5G system nodes (e.g., sensing Tx nodes) and 5G system signals (e.g., positioning or sensing signals) can be reused to efficiently generate and process the desired sensing and positioning measurements, see Figure 5.32.1-1 as an example of a joint sensing and positioning of a UE.
Figure 5.32.1-1 The 5G system obtains position estimate of a UE, utilizing 3GPP sensing data of the UE obtained from the 5G wireless sensing service, in combination with the positioning measurement of the UE
The obtained higher positioning accuracy of an AGV/AMR is particularly valuable for coordination of multiple AGVs and AMRs with non-deterministic movement paths, where situations involving sudden braking and/or velocity changes may lead to high delay, damage risk, and interruption energy loss.
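How the 5G network combines a positioning measurement with a sensing-based estimate of the same AMR/AGV is not specified in this report; one illustrative (hypothetical) approach is a per-axis variance-weighted average of the two independent estimates:

```python
# Hypothetical variance-weighted fusion of two independent position estimates
# of the same AMR/AGV (one from 5G positioning, one from 5G sensing), per axis.
def fuse(pos_est: float, pos_var: float, sens_est: float, sens_var: float):
    w = sens_var / (pos_var + sens_var)          # weight on the positioning fix
    fused = w * pos_est + (1.0 - w) * sens_est
    fused_var = (pos_var * sens_var) / (pos_var + sens_var)  # <= min of inputs
    return fused, fused_var

# Example: two estimates with equal variance fuse to their mean, with the
# variance halved — i.e. the fused estimate is never worse than either input.
est, var = fuse(1.0, 0.04, 1.2, 0.04)
```

This is only a sketch of the principle that combining positioning and sensing measurements improves accuracy; the actual fusion mechanism is left to the network implementation.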
5.32.2 Pre-conditions
Multiple AMR/AGVs are deployed in a factory hall belonging to company X. The AMR/AGVs are coordinated by a command center to perform a collaborative construction task. The command center coordinates the AMR/AGVs’ movements to improve safety and to avoid, as much as possible, delay and the energy loss due to AMR/AGV braking.
To facilitate this, the factory hall is equipped with the 5G system sensing services provided by the Mobile Network Operator (MNO) A. Moreover, the AMR/AGVs are equipped with the 5G system communication and positioning modules.
The company X has provided the MNO with the physical type/characteristics of the deployed AMR/AGVs, as well as the installed camera data of the factory hall to assist detection and positioning of the AMR/AGVs via sensing. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.32.3 Service Flows | Step 1: [AMR/AGV is deployed to deliver goods]
AMR/AGV Y is assigned with a task for delivering needed material to a construction site within the factory hall. The AMR/AGV is loaded with the materials and departs from its initial location.
Step 2: [AMR position is obtained via 5G system positioning]
AMR/AGV moves from its initial position towards the construction site. The position information of the AMR/AGV is obtained by the MNO A via the 5G system positioning module of the AMR/AGV and reported to the command center.
The command center determines that the provided positioning accuracy of MNO A is sufficient, since the AMR/AGV currently moves in the low-traffic area of the factory hall. Based on the received positioning information of the AMR/AGV, the command center recommends that the AMR/AGV keep its velocity towards the construction site.
Step 3: [AMR/AGV position is obtained via a joint 5G system sensing and positioning]
As the AMR/AGV moves towards the construction site, more objects (other AMR/AGVs, humans, tools) appear in the vicinity. The command center identifies that the AMR/AGV is now in a high-traffic area and a higher positioning accuracy is desired.
The command center requests the MNO A to activate 5G system sensing service during the 5G system positioning service for positioning of the AMR/AGV UEs.
MNO A activates sensing of the desired AMR/AGV areas to enhance positioning of the AMR/AGV devices. MNO A identifies UEs and/or gNBs capable of sensing in the vicinity of the AMR/AGV and starts the sensing measurement process jointly with the 5G positioning measurements of the AMR/AGVs. The 5G network obtains 3GPP sensing data from the identified nodes and generates a high accuracy position estimate and/or additional sensing information of the AMR/AGVs, based on the collected 3GPP sensing data in addition to the AMR/AGVs’ positioning measurements. The obtained position estimate of the AMR/AGV is reported to the command center.
Based on the obtained high-accuracy positioning information of the AMR/AGVs, the command center adjusts the velocity of the AMR/AGVs.
Step 4: [AMR reaches the construction site]
The AMR/AGV reaches the construction site and offloads the goods. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.32.4 Post-conditions | Thanks to the 5G system based sensing service enabling an enhanced AMR/AGV positioning and sensing of the environment, the involved AMR/AGVs are coordinated to arrive at their destination with minimal risk and interruption loss. |
e8cee4e428329a7668584ba76bf8de13 | 22.837 | 5.32.5 Existing feature partly or fully covering use case functionality | A UE equipped with 5G positioning module may obtain positioning information of the UE based on the 5G positioning services. Moreover, the 5G system sensing services shall support detection and positioning of an object. Nevertheless, interpretation of an object’s position as a UE’s position (e.g., among multiple detected objects), as well as a joint positioning and sensing of a UE device as a physical object by the 5G system (when both 5G system services are available to the UE) has not been enabled by the current requirements. |
5.32.6	Potential New Requirements needed to support the use case
[PR 5.32.6-1] Based on operator's policy, the 5G system may provide a mechanism for a trusted third party to provide sensing assistance information about a sensing target.
[PR 5.32.6-2] The 5G system shall be able to provide sensing results with the following KPIs:
Table 5.32.6-1: Performance requirements of the sensing results exposed to the third party

| Scenario | Sensing service area | Confidence level [%] | Horizontal accuracy of positioning estimate by sensing [m] | Vertical accuracy of positioning estimate by sensing [m] | Horizontal accuracy of velocity estimate by sensing [m/s] | Vertical accuracy of velocity estimate by sensing [m/s] | Range resolution [m] | Velocity resolution (horizontal/vertical) [m/s x m/s] | Max sensing service latency [ms] | Refreshing rate [s] | Missed detection [%] | False alarm [%] |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Indoor Factory | 100 m2 | 99 | [≤0.5] | N/A | 0.5 | N/A | [0.5] | [0.5] | ≤100 | 0.1 | N/A | N/A |

NOTE: The terms in Table 5.32.6-1 are found in Section 3.1. Positioning and velocity accuracies are given for the target confidence level.
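As an illustration only, the Indoor Factory row of Table 5.32.6-1 could be checked programmatically against a reported sensing result. The dictionary keys and the treatment of bracketed table values as firm thresholds are assumptions made for this sketch, not part of the requirement.

```python
# KPI thresholds taken from the Indoor Factory row of Table 5.32.6-1;
# bracketed values in the table are treated here as firm thresholds.
INDOOR_FACTORY_KPIS = {
    "confidence_level_pct": 99,
    "horizontal_pos_accuracy_m": 0.5,    # [<=0.5]
    "horizontal_vel_accuracy_mps": 0.5,
    "range_resolution_m": 0.5,           # [0.5]
    "max_latency_ms": 100,               # <=100
}

def meets_kpis(result: dict, kpis: dict = INDOOR_FACTORY_KPIS) -> bool:
    """Return True if a sensing result satisfies every KPI threshold."""
    return (
        result["confidence_level_pct"] >= kpis["confidence_level_pct"]
        and result["horizontal_pos_accuracy_m"] <= kpis["horizontal_pos_accuracy_m"]
        and result["horizontal_vel_accuracy_mps"] <= kpis["horizontal_vel_accuracy_mps"]
        and result["range_resolution_m"] <= kpis["range_resolution_m"]
        and result["latency_ms"] <= kpis["max_latency_ms"]
    )
```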
6	Considerations
6.1	Considerations on confidentiality, integrity and privacy
6.1.1	General
When introducing sensing technology, new aspects of confidentiality, integrity, and privacy need to be taken into account, so that they are addressed already when service requirements are proposed.
For instance, sensing technology can affect bystanders in a completely new way: previously, only UEs could be tracked, but sensing capabilities may enable the tracking and potentially the identification of anything in the environment, including objects and humans that do not carry a UE. This has clear implications for privacy, as humans have a right to privacy.
For privately owned areas, permission for sensing operation is required from the responsible party, e.g. the homeowner for in-home sensing or the building management for in-building sensing.
For public areas, such as a public road, park or airport, the permission of the respective public area management is required.
It is important to obtain the user's consent before the network uses a UE to provide a sensing service. If the sensing results and the user's identity are brought together for further processing, user consent is also needed.
Of course, factors such as resolution, updating frequency, and type of application influence the security implications.
Requirements to minimize the risk of unwanted usage, and to provide awareness of such usage, need to be considered in stage 1. These are captured in the next clause.
6.1.2	Potential New Requirements
A set of general new requirements can be identified:
[PR 6.1.2-1] The 5G system shall send sensing results only to a third party authorized to receive those sensing results.
[PR 6.1.2-2] The 5G system shall support encryption and integrity protection of the sensing result, to protect the data inside the 5G system and when used.
[PR 6.1.2-3] The 5G system shall support appropriate level of sensing for both situations where consent can be obtained from the sensing targets, and where it cannot.
[PR 6.1.2-4] Subject to regulation, the 5G system shall obtain user consent when sensing results and user identification are brought together for further processing.
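The combined effect of PR 6.1.2-1 (authorized recipients only) and PR 6.1.2-4 (user consent when results are linked to a user identity) can be sketched as a simple exposure gate. The data model, function name, and consent store are hypothetical, introduced only for illustration.

```python
def may_expose(result, third_party, authorized_parties, consent_db):
    """Decide whether a sensing result may be sent to a third party."""
    if third_party not in authorized_parties:      # PR 6.1.2-1: authorization
        return False
    user_id = result.get("user_id")
    if user_id is not None:                        # result linked to a user ID
        return consent_db.get(user_id, False)      # PR 6.1.2-4: user consent
    return True                                    # anonymous sensing result
```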
6.2 Considerations on Regulatory, Mission Critical and other priority services |
6.2.1	General
Sensing operation in an operator's network can support commercial services (e.g. the use case described in section 5.8 on sensing-assisted automotive manoeuvring and navigation). There can be areas where network resources are limited and prioritization (according to the operator's decision) is needed between the resources used for the sensing service and the resources used for other services (e.g. communication services).
In addition, sensing operation could also be used to support services such as MCS (e.g. public safety, Utilities, Railways) and MPS with requirements for priority treatment. The 5G system can provide flexible means for priority treatment to the users of sensing services subject to regional/national regulatory rules and operator policy. |
6.2.2	Potential New Requirements
[PR 6.2.2-1] Subject to regulation and operator's policy, the 5G system shall provide prioritization among sensing services.
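A minimal sketch of priority treatment under PR 6.2.2-1, assuming three illustrative priority classes (MCS, MPS, commercial) derived from the discussion in clause 6.2.1; the class names, ordering, and scheduling model are assumptions, not 3GPP-defined.

```python
# Illustrative priority classes; lower value means higher priority.
PRIORITY = {"MCS": 0, "MPS": 1, "commercial": 2}

def schedule(requests, capacity):
    """Grant limited sensing resources to the highest-priority
    sensing service requests first (stable for equal priority)."""
    ordered = sorted(requests, key=lambda r: PRIORITY[r["class"]])
    return [r["id"] for r in ordered[:capacity]]
```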