hash | doc_id | section | content
---|---|---|---|
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.4 Use Case on International Roaming Users in a Shared Network | |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.4.1 Description | Network sharing scenarios arise in the process of 5G deployment due to different operators' business considerations, such as network planning, operation and other factors. These introduce demand for 5G network sharing with an indirect connection between the shared access network and a participating operator's core network, especially in countries with wide rural areas, where seamless 5G radio coverage is hard to achieve. By benefiting from 5G network sharing technology, an operator can not only save investment in 5G deployment but also expand 5G coverage over a vaster area. However, in this case it is also required to allow inbound international roaming subscribers to use the shared resources of a hosting operator. It is therefore suggested to investigate how to realize this scenario. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.4.2 Pre-conditions | In the 5G scenario, an operator may deploy its network as a shared network applicable to other operators.
As illustrated in Figure 5.4.2-1, three operators (OP1, OP2, OP3) are involved:
- OP1, as a Hosting NG-RAN Operator, owns an applicable Shared NG-RAN which can be shared with other operators (e.g., OP2).
- OP2, as a Participating NG-RAN Operator, has a 5G network sharing agreement with OP1. In addition, OP2 may have its own non-shared 5G network.
- The Shared NG-RAN of OP1 has no direct connection between the shared access network and the core network of the participating operator OP2.
- OP3 is a roaming partner of OP2, but has no roaming agreement with OP1.
- The UE subscribes to OP3's network, i.e., a foreign operator's network.
Figure 5.4.2-1: Scenario of International Roaming Users in a Shared Network |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.4.3 Service Flows | When a UE of OP3 travels into the coverage of OP2's NG-RAN, it can enjoy the services provided by its home environment in another country via OP2's network.
When a UE of OP3 travels into the coverage of OP1's shared NG-RAN, it can enjoy the services provided by its home environment in another country via OP1's shared access network and OP2's network. |
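The agreement chain behind these flows can be sketched in a few lines. The operator names, data structures, and the `serving_core_network` helper below are purely illustrative assumptions made for this sketch; they are not part of any 3GPP-specified procedure.

```python
from typing import Optional

# Illustrative model of the 5.4.3 flows: an inbound roamer registers with a
# participating operator's core network either directly or via a hosting
# operator's Shared NG-RAN (Indirect Network Sharing).
ROAMING_AGREEMENTS = {("OP3", "OP2")}   # (home operator, roaming partner)
SHARING_AGREEMENTS = {("OP1", "OP2")}   # (hosting operator, participating operator)

def serving_core_network(home_op: str, ran_owner: str) -> Optional[str]:
    """Return the core network operator able to serve an inbound roamer
    attaching via the given RAN, or None if no agreement chain exists."""
    # Direct case: the RAN owner itself is a roaming partner of the home operator.
    if (home_op, ran_owner) in ROAMING_AGREEMENTS:
        return ran_owner
    # Indirect Network Sharing case: the RAN is shared, and a participating
    # operator is a roaming partner of the home operator.
    for hosting, participating in SHARING_AGREEMENTS:
        if hosting == ran_owner and (home_op, participating) in ROAMING_AGREEMENTS:
            return participating
    return None

# OP3's UE on OP2's own NG-RAN is served by OP2 directly.
assert serving_core_network("OP3", "OP2") == "OP2"
# On OP1's Shared NG-RAN the UE is still served by OP2's core network,
# even though OP3 has no roaming agreement with OP1.
assert serving_core_network("OP3", "OP1") == "OP2"
```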
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.4.4 Post-conditions | The service experience and treatment are the same for OP2's UEs and OP3's UEs in the coverage of OP1's Shared NG-RAN, based on the agreements between the operators. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.4.5 Existing Features partly or fully covering Use Case Functionality | 3GPP TS 22.011 [6] introduces requirements on roaming in shared networks, as described below:
The following requirements are applicable to GERAN, UTRAN, E-UTRAN and NG-RAN sharing scenarios:
- When network sharing exists between different operators and a user roams into the shared network it shall be possible for that user to register with a core network operator (among the network sharing partners) that the user’s home operator has a roaming agreement with, even if the operator is not operating a radio access network in that area.
Other roaming-related requirements are described in TS 22.101 [2]:
3GPP specifications should be in compliance with the following objectives:
e) to provide support of roaming users by enabling users to access services provided by their home environment in the same way even when roaming. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.4.6 Potential New Requirements needed to support the Use Case | [PR 5.4.6-001] The 5G system shall enable the Shared NG-RAN of a Hosting Operator, with an indirect connection between the Shared NG-RAN and a Participating NG-RAN Operator's core network, to provide services for inbound roaming users.
NOTE: Inbound roaming users mentioned above refer to the subscribers of a foreign operator having a roaming agreement with one participating operator.
5.5 Use Case on Hosted Services |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.5.1 Description | Subscribers can access Hosted Services via their own operator's network, as specified in TS 22.261 [3]. These Hosted Services contain applications provided by operators and/or trusted 3rd parties. If the Hosted Services run in the operator's Service Hosting Environment, operators usually use technologies that optimize the user-plane path to reduce latency and bandwidth pressure.
GSMA describes a scenario in which the Hosted Services of a party operator are located at a shared edge node of the hosting operator; the Hosted Services are then also applicable to users accessing through the PLMN of the party operator. A shared edge node means that the hosting operator deploys applications on cloud resources in its edge node as requested by a party operator and provides an application endpoint to the party operator; see S2-2208120.
However, when users access through a shared network, the Hosted Services are still expected to be available. This use case addresses the possibility of subscribers accessing Hosted Services via shared networks. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.5.2 Pre-conditions | Assumptions:
1. OP1 is a Hosting NG-RAN Operator.
2. The core network of OP2, as the participating operator, does not have a direct connection with OP1's shared NG-RAN.
3. There is a connection between OP1's CN and OP2's CN.
4. OP2 provides Hosted Services for subscribers in local areas 1 and 2. OP2 wishes to provide Hosted Services for subscribers accessing through OP1's and OP2's networks in local area 1. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.5.3 Service Flows | Figure 5.5.3-1: Scenario of visiting Hosted Services via shared network
1. The UE is a subscriber of OP2 and enjoys the Hosted Services of OP2 in local area 2.
2. The UE moves to the shared network of OP1. During this process, the Service Hosting Environment hosting the service may or may not change.
3. An available Service Hosting Environment can still be found in local area 1 to provide services for the UE. The Hosted Services remain applicable when the UE crosses between local areas 1 and 2 and/or stays within area 1. |
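The flow above amounts to finding an available Service Hosting Environment for the UE's current area. The sketch below is a hypothetical illustration: the area identifiers, the `SHE-*` names, and the helper function are assumptions for the example, not specified behaviour.

```python
# Illustrative model of the 5.5.3 flow: a Hosted Service stays available as
# the UE moves, although the serving Service Hosting Environment may change.
SERVICE_HOSTING_ENVS = {
    "local_area_1": "SHE-1",  # reachable via OP1's shared NG-RAN and OP2's NG-RAN
    "local_area_2": "SHE-2",  # reachable via OP2's own NG-RAN
}

def hosted_service_endpoint(area: str) -> str:
    """Pick the Service Hosting Environment serving the UE's current area."""
    try:
        return SERVICE_HOSTING_ENVS[area]
    except KeyError:
        raise LookupError(f"no Service Hosting Environment covers {area}")

# Step 1: the UE enjoys OP2's Hosted Services in local area 2.
assert hosted_service_endpoint("local_area_2") == "SHE-2"
# Steps 2-3: after moving into OP1's shared network in local area 1,
# the service is still served, possibly by a different environment.
assert hosted_service_endpoint("local_area_1") == "SHE-1"
```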
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.5.4 Post-conditions | Access to the Hosted Services via the shared network succeeds when the UE moves between the shared access network and the participating operator's network. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.5.5 Existing Features partly or fully covering Use Case Functionality | Clause 6.5 of TS 22.261 [3] introduces the relevant terms and related requirements for Hosted Services. Some typical requirements for services in network sharing are:
5G is designed to meet diverse services with different and enhanced performances (e.g. high throughput, low latency and massive connections) and data traffic model (e.g. IP data traffic, non-IP data traffic, short data bursts and high throughput data transmissions).
User plane should be more efficient for 5G to support differentiated requirements. On one hand, a Service Hosting Environment located inside of operator's network can offer Hosted Services closer to the end user to meet localization requirement like low latency, low bandwidth pressure. These Hosted Services contain applications provided by operators and/or trusted 3rd parties. On the other hand, user plane paths can be selected or changed to improve the user experience or reduce the bandwidth pressure, when a UE or application changes location during an active communication.
In addition, clause 4.23.9 of TS 23.502 [7] provides further work related to an efficient user plane, mainly describing the additional PDU Session Anchor and Branching Point or UL CL controlled by an I-SMF. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.5.6 Potential New Requirements needed to support the Use Case | [PR 5.5.6-001] In case of Indirect Network Sharing, the 5G system shall support a mechanism to enable a UE subscribed to the participating operator to access the participating operator’s Hosted Services.
5.6 Use Case on Emergency Call |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.6.1 Description | The network is allowed to route an emergency call to the emergency response centre serving the area where the subscriber is located, as specified in TS 22.101 [2], in accordance with national regulations.
In order to route emergency calls to the correct emergency response centre, operators may need to use the UE location information. Operators may use different location information depending on the requirements of national regulation. Sometimes operators have to change the network configuration to provide correct routing or location-related services in an area.
This use case covers a subscriber of the participating operator initiating an emergency call in the shared network, which raises the problem, for the service provider, of routing the emergency call according to the location information of the hosting operator. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.6.2 Pre-conditions | Assumptions:
1. OP-1 is a Hosting NG-RAN Operator. There is a connection between OP-1's CN and OP-2's CN, with OP-2, as a participating operator, sharing the network indirectly.
2. OP-2, as the participating operator, is an emergency call service provider. The network of OP-2 is connected to the emergency response centre located in Area 1.
3. OP-3 is a Hosting RAN Operator of NG-RAN and E-UTRAN. OP-3 and OP-2 have a sharing agreement based on MOCN. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.6.3 Service Flows | Figure 5.6.3-1: Emergency call routing in network sharing scenarios
1. The UE is registered with OP-2. The UE accesses the shared network of OP-1 and registers with OP-2's IMS. The emergency response centre nearest to the UE's current geographic location is in Area 1, as shown in Figure 5.6.3-1.
2. The UE initiates an emergency call in the shared network; it is most reasonable for the core network of OP-2 to route the emergency call to the PSAP within Area 1.
3. OP-2's network routes the emergency call to the PSAP in Area 1 according to the location information transmitted by the shared network. |
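The routing decision in steps 2-3 can be sketched as a lookup from the location reported by the shared NG-RAN to a PSAP registry held by the participating operator's core network. The registry contents and the location encoding below are hypothetical assumptions for the example only.

```python
# Illustrative model of the 5.6.3 flow: OP-2's core network routes an
# emergency call using location information from the hosting operator's
# Shared NG-RAN. PSAP names and the area encoding are made up for the sketch.
PSAP_BY_AREA = {"Area 1": "psap-area1.example"}  # PSAPs connected to OP-2

def route_emergency_call(location_from_shared_ran: str) -> str:
    """Return the PSAP serving the area reported by the shared NG-RAN."""
    psap = PSAP_BY_AREA.get(location_from_shared_ran)
    if psap is None:
        raise LookupError(f"no PSAP registered for {location_from_shared_ran}")
    return psap

# The shared NG-RAN reports that the caller is in Area 1, so OP-2's
# network routes the call to the PSAP serving Area 1.
assert route_emergency_call("Area 1") == "psap-area1.example"
```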
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.6.4 Post-conditions | The emergency call can be correctly routed to the appropriate PSAP to serve the UE according to its location information. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.6.5 Existing Features partly or fully covering Use Case Functionality | Clause 10 of TS 22.101 [2] defines the emergency call service. Depending on national regulations, the requirements on the UE's location for the emergency call service differ between deployment scenarios. For example:
National regulations may require wireless networks to provide the emergency caller's location. This requirement typically overrides the caller's right to privacy with respect to their location being revealed, but remains in effect only as long as the authorities need to determine the caller's location. The interpretation of the duration of this privacy override may also be different, subject to national regulation. For example, some countries require location to be available from the wireless network only while the call is up, while others may allow PSAP's to unilaterally decide how long the location must be made available.
Therefore, the requirement for providing location availability is to allow the network to support providing a mobile caller's location to the authorities for as long as required by the national regulation in force for that network.
NOTE: See TS 22.071 [8] for location service requirements on emergency calls. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.6.6 Potential New Requirements needed to support the Use Case | [PR 5.6.6-001] Subject to national or regional regulatory and operator policies, the 5G system shall support emergency call from a UE accessing a Shared NG-RAN via Indirect Network Sharing.
[PR 5.6.6-002] Subject to national or regional regulatory and operator policies, the 5G system shall provide to the Participating Operator the location information of an emergency caller accessing a Shared NG-RAN via Indirect Network Sharing. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.7 Use Case on Long-distance Mobility in and across Shared Networks | |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.7.1 Description | Although air and rail transport have been developed to serve more and more regions, road transport remains the most important way to deliver goods. As the logistics industry booms, the total length of long-distance road transport has increased every year for the last ten years. How to provide communication services during long-distance transport is therefore worth discussing in order to serve transportation and logistics customers. Network sharing offers a way for network operators to share the heavy cost of deploying mobile networks, as well as to improve service availability and user experience along transport routes.
Compared with the MOCN configuration, network sharing with an indirect connection between the Shared NG-RAN and the core networks of participating operators allows flexible access options for the sharing parties to consider. For example, a network operator may need to provide communication services along a dedicated transport route across dense urban, rural, and desert areas, but cannot find a single hosting operator with full coverage of the whole route. It can then cooperate separately with a terrestrial network operator and a 3GPP satellite network operator to ensure full coverage. Also, subject to the agreement, it can leverage the temporary coverage enhancement plans of hosting operators for certain regions, such as deploying onboard base stations in dense urban areas, to provide a better user experience to its own users.
Figure 5.7.1-1: Desert Highway across shared networks [9] |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.7.2 Pre-conditions | It is assumed that network operators OP1, OP2, and OP3 have each deployed 5G networks in a country. Their radio access networks provide coverage to different regions of the country and together cover the entire country, with overlap in certain regions.
OP1 and OP3 have agreed with OP2 to share their radio access networks in the planned geographic areas via an indirect connection between the shared radio access networks and OP2's core network.
- OP1 is a Hosting NG-RAN Operator, who has deployed different types of 5G access nodes (e.g., macro, micro, onboard, and IAB). It shares only macro base stations (e.g., 5G BS#A, BS#B, BS#C) with the participating operator OP2, but temporarily allows access to dedicated types of access nodes (e.g., onboard) according to the agreement and the policy.
- OP3 is another Hosting RAN Operator, sharing 3GPP satellite access nodes with OP2.
- OP2 has a non-shared 5G access network deployed in some regions of the country, shown as 4G&5G BS.
- The UE on the vehicle supports all 5G RATs and has a subscription with OP2's network.
Figure 5.7.2-1: Long-distance mobility in and across shared networks |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.7.3 Service Flows | 1. The UE on the vehicle connects to OP2's network via OP1's shared access nodes and is authorized to initiate a communication service (e.g., data transfer, IMS voice call) during the movement.
2. In geographic Area #A, the UE connects to OP2's network only through access nodes like 5G BS #A, based on the agreement and the policies of OP1 and OP2.
3. A regional earthquake happens and results in unstable performance of 5G BS #B in Area #B. OP1's UAV base stations are shared with OP2 as supplementary or alternative access nodes based on the OP1 and OP2 agreement.
4. When the UE reaches Location Loc #A, where it can detect the radio signal of both 5G BS #B and the UAV base station, the UE accesses a proper OP1 access node (e.g., 5G UAV BS) to continue the service, according to the subscription and the policy and agreement of OP1 and OP2.
5. When the UE reaches Location Loc #B, where it can detect the radio signal of OP1 5G BS#C and the OP3 satellite access node, the UE accesses a proper Hosting RAN Operator's access node (e.g., OP3 satellite) to continue the service, according to the subscription and the policies and agreements of OP1, OP3 and OP2.
6. When the UE reaches Location Loc #C, where it can detect the radio signal of the OP3 satellite access node and OP2's 4G&5G BS, the UE selects OP2's 5G BS to continue the service according to the policies of OP3 and OP2. |
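Steps 4-6 above amount to a policy-constrained choice among detected access nodes. The sketch below is a hypothetical illustration: the node identifiers, the allowed-node table, and the "own network first" preference rule are assumptions made for the example, not 3GPP-defined behaviour.

```python
# Illustrative model of the access-node selection in 5.7.3: the UE prefers
# OP2's own access nodes and otherwise falls back to shared nodes permitted
# by the sharing agreements. Node names are made up for the sketch.
ALLOWED_SHARED_NODES = {"OP1:5G BS#B", "OP1:5G BS#C", "OP1:UAV BS", "OP3:Satellite"}
OWN_NODES = {"OP2:4G&5G BS"}

def select_access_node(detected: list) -> str:
    """Prefer OP2's own access node when detected (step 6); otherwise pick
    the first detected shared node permitted by the agreements (steps 4-5)."""
    for node in detected:
        if node in OWN_NODES:
            return node
    for node in detected:
        if node in ALLOWED_SHARED_NODES:
            return node
    raise LookupError("no authorized access node detected")

# Loc #B: a terrestrial shared node and a shared satellite node are both
# detected; either is permitted, so the first authorized one is chosen.
assert select_access_node(["OP1:5G BS#C", "OP3:Satellite"]) == "OP1:5G BS#C"
# Loc #C: OP2's own BS is detected alongside OP3's satellite and is preferred.
assert select_access_node(["OP3:Satellite", "OP2:4G&5G BS"]) == "OP2:4G&5G BS"
```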
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.7.4 Post-conditions | The UE can obtain services, with OP2's network displayed as the serving network, when moving within OP1's Shared NG-RAN, across OP1's and OP3's shared RANs, and across OP3's shared RAN and OP2's non-shared RAN.
OP2 ensures service continuity with proper mobility and access control towards the shared access nodes of the same or different Host RAN Operators. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.7.5 Existing Features partly or fully covering Use Case Functionality | The normative stage 1 requirements for network sharing have been introduced into TS 22.101 [2] and TS 22.261 [3] in previous releases. TS 22.101 [2] specifies the following service principles for network sharing:
It shall be possible to support different mobility management rules, service capabilities and access rights as a function of the home PLMN of the subscribers. [clause 4.9]
The 3GPP System shall support a Shared RAN (E-UTRAN or NG-RAN) to cover a specific coverage area (e.g. from a complete network to a single cell, both outdoor and indoor). [clause 28.2.1]
Network sharing has been extended to different types of access networks, as stated in TS 22.261 [3]:
A 5G satellite access network shall support NG-RAN sharing.
TS 22.261 [3] also specifies general requirements on access, mobility management and service continuity between different access networks, as below:
Based on operator policy, the 5G system shall support steering a UE to select certain 3GPP access network(s). [clause 6.3.2]
The 5G system shall support inter- and/or intra- access technology mobility procedures within 5GS with minimum impact to the user experience (e.g., QoS, QoE). [clause 6.2.2]
The 5G system shall support service continuity between 5G terrestrial access network and 5G satellite access networks owned by the same operator or owned by different operators having an agreement. [clause 6.2.3]
In the MOCN configuration, where the shared access networks have direct connections to the Participating Operator's core network, access control and mobility management are performed directly by the Participating Operator's core network.
However, in Indirect Network Sharing, besides the Participating Operator's core network, the Host RAN Operator's core network may be involved in access control and mobility management. Also, a UE may connect to the Participating Operator's core network via the shared access nodes of different Host RAN Operators.
TS 22.261 [3] also defines requirements on multi-network connectivity and service delivery across operators, as below:
To provide a better user experience for their subscribers with UEs capable of simultaneous network access, network operators could contemplate a variety of sharing business models and partnership with other network and service providers to enable its subscribers to access all services via multiple networks simultaneously, and with minimum interruption when moving.
For a user with a single operator subscription, the use of multiple serving networks operated by different operators shall be under the control of the home operator. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.7.6 Potential New Requirements needed to support the use case | [PR 5.7.6-001] Subject to the agreement between Hosting Operators and Participating Operator, the 5G system shall support a mechanism to authorize a UE with the subscription to the Participating Operator to obtain services in a Shared NG-RAN of Hosting Operators using Indirect Network Sharing.
[PR 5.7.6-002] In case of Indirect Network Sharing, based on the Hosting and Participating Operators’ policies, the 5G system shall enable the Participating Operator to provide steering information to the Hosting Operator.
NOTE 1: Such steering information for a given UE of the Participating Operator could be applied by the Hosting Operator to select amongst the Hosting Operator’s available shared access network(s).
[PR 5.7.6-003] In case of Indirect Network Sharing, the 5G system shall be able to apply different access control for different access networks of Shared NG-RANs based on the Hosting and Participating Operator’s agreement and policies. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.8 Use Case on Support of PWS in Shared NG-RAN in Indirect Interconnection with the Participating Operators | |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.8.1 Description | Operators 1 and 2, both licensed to operate a 4G network and a 5G network, share the 5G part of the radio access network in some areas of the country.
The Public Safety Agency sends a PWS message in a certain area covered by NG-RAN 1 and RAN 2, of Operator 1 and Operator 2 respectively, and by the shared NG-RAN of Operator 1 and Operator 2. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.8.2 Pre-conditions | • Operator 1 owns NG-RAN 1 which is not shared.
• Operator 2 owns RAN 2 which is not shared.
• NG-RAN is a shared NG-RAN between Operator 1 and Operator 2.
• Operator 1 is the hosting operator.
• Operator 2 is a participating operator. Operator 1 and Operator 2 share NG-RAN in an indirect sharing method.
• Operator 1 has a regulatory obligation to broadcast PWS message to all UEs operating on NG-RAN 1 and NG-RAN.
• Operator 2 has a regulatory obligation to broadcast PWS message to all UEs operating on NG-RAN and RAN 2.
• UEs operating in NG-RAN 1 or RAN 2 may hand over to NG-RAN during the period in which the PWS message is being periodically rebroadcast.
• UEs subscribed to Operator 1 operating in NG-RAN may hand over to NG-RAN 1 during the period in which the PWS message is being periodically rebroadcast (and vice versa).
• UEs subscribed to Operator 2 operating in NG-RAN may hand over to RAN 2 during the period in which the PWS message is being periodically rebroadcast (and vice versa).
Figure 5.8.2-1 PWS Scenario in Indirect Network Sharing |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.8.3 Service flows | 1) The Public Safety Agency (PSA) sends the PWS message in a certain Shared NG-RAN radio coverage area designated by the PSA where Operator 1 and Operator 2 share the RAN access.
2) Operator 1 and Operator 2 both broadcast the PWS message in the areas covered by their respective non-shared RANs for which they have regulatory obligations (NG-RAN 1 and RAN 2).
3) The UEs served by NG-RAN 1 and RAN 2 receive the warning message from their corresponding operating radio access.
4) The UEs served by Shared NG-RAN of the hosting Operator 1 receive the public warning message created for the specific region which is covered by NG-RAN. |
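The UE-side behaviour implied by this flow, displaying each warning once regardless of which RAN delivered it or of handovers during rebroadcast (see 5.8.4), can be sketched as duplicate suppression keyed on a message identifier. The class and the identifier scheme below are illustrative assumptions, not specified mechanisms.

```python
# Illustrative model of the UE behaviour in 5.8.3/5.8.4: a UE presents each
# PWS message once even when it is rebroadcast by several (shared or
# non-shared) RANs during handover.
class PwsReceiver:
    """Tracks already-presented warning message identifiers on the UE."""

    def __init__(self):
        self._seen = set()

    def receive(self, message_id, content):
        """Return the content to display, or None for a duplicate rebroadcast."""
        if message_id in self._seen:
            return None
        self._seen.add(message_id)
        return content

ue = PwsReceiver()
# First delivery via the Shared NG-RAN is displayed.
assert ue.receive("warn-001", "Earthquake warning") == "Earthquake warning"
# The same message rebroadcast after handover to NG-RAN 1 is suppressed.
assert ue.receive("warn-001", "Earthquake warning") is None
# A new warning issued after the UE leaves the shared network is displayed.
assert ue.receive("warn-002", "Aftershock warning") == "Aftershock warning"
```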
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.8.4 Post-conditions | On the non-shared accesses NG-RAN 1 and RAN 2, the public warning message can only be received by UEs subscribed to the operators owning the access.
UEs served by the Shared NG-RAN receive the public warning message issued by the PSA in the radio coverage area of the Shared NG-RAN designated for the warning message.
The UEs display only one copy of the PWS message, regardless of the RAN they are operating in or may hand over to.
The UEs present new public warning message content after entering or leaving the shared network. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.8.5 Existing requirements to PWS Support in Shared Networks | See TS 22.101 [2].
28.2.5 PWS support of Shared E-UTRAN/NG-RAN
A Participating Operator potentially has regulatory obligations to initiate the broadcast of PWS messages regardless of E-UTRAN/NG-RAN sharing.
The following requirements apply:
The shared E-UTRAN/NG-RAN shall be able to broadcast PWS messages originated from the core networks of all Participating Operators.
NOTE 1: Rel-11 design requires a shared PWS core. However, some regulatory obligations require a solution in which no common PWS core network entity is involved.
28.3.5 PWS support of Shared GERAN or UTRAN
A Participating Operator potentially has regulatory obligations to initiate the broadcast of PWS messages regardless of GERAN or UTRAN sharing.
The following requirements apply:
The shared GERAN or UTRAN shall be able to broadcast PWS.
NOTE 2: The Hosting RAN Operator is responsible for the delivery of PWS messages to the UEs.
Identifying, changing, and adding appropriate functionality in the network can lead to better shared-network operation. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 5.8.6 Potential New Requirements needed to support the Use Case | [PR 5.8.6-001] Subject to regulatory requirements and mutual agreement between the Participating Operators and the Hosting Operator, the 5G system shall be able to support PWS when connectivity is provided via Indirect Network Sharing.
[PR 5.8.6-002] Subject to regulatory requirements and mutual agreement between the Participating Operators and the Hosting Operator, Shared NG-RAN in Indirect Network Sharing shall be able to broadcast PWS messages originated from the core network of the Hosting Operator. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 6 Other considerations | |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 6.1 Considerations on security | Different from the MOCN configuration, which routes the UE directly to the core network of a participating operator through the Shared RAN of the hosting operator, Indirect Network Sharing involves the core network of the Hosting NG-RAN Operator in the signalling exchange. The interworking between the core networks of the Hosting NG-RAN Operators and the participating operator may expose more information about the subscribers, the operators' policies and the operations of the respective core networks, especially information irrelevant to Indirect Network Sharing. Therefore, additional security measures for user privacy and operator policy protection should be taken into account for the Indirect Network Sharing configuration. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7 Consolidated requirements | |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7.1 Introduction | Potential requirements on Indirect Network Sharing have been identified from the use cases in Clause 5, under the following aspects:
- General
- Mobility
- Network access control
- Regulatory services
- Charging
These requirements have been especially introduced to support the newly defined network sharing scenario, where a NG-RAN is shared among multiple operators without necessarily assuming a direct connection between the shared NG-RAN and the Participating NG-RAN Operator’s core network.
NOTE: These enhanced requirements may go beyond existing RAN sharing requirements in Rel-13. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7.2 Consolidated Potential Requirements | |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7.2.1 Description | Indirect Network Sharing is another type of NG-RAN sharing, which facilitates operators extending the availability of 5G services to more coverage areas for their subscribers. It allows NG-RAN resources to be used by two or more operators simultaneously while minimizing the impact on the network and associated services.
The deployment of Indirect Network Sharing is expected to be transparent to users when they obtain network services in their subscribed PLMN, as with other NG-RAN sharing configurations (e.g., MOCN). The involvement of the core network of the Hosting Operator, e.g., for signalling exchange between the users and the core network of the Participating Operator, could expose the subscriber's information to the hosting network.
Thus, extra scrutiny is expected to avoid sharing information between the hosting operator and the participating operator that is not needed for Indirect Network Sharing operation. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7.2.2 General | Table 7.2.2-1 General requirements
CPR #
Consolidated Potential Requirement
Original PR #
Comment
CPR 7.2.2-001
The 5G system shall be able to support Indirect Network Sharing between the Shared NG-RAN and one or more Participating NG-RAN Operators’ core networks, by means of the connection being routed through the Hosting NG-RAN Operator’s core network.
[PR 5.1.6-001]
Align with the definitions.
CPR 7.2.2-002
In case of Indirect Network Sharing, the 5G system shall be able to support means for a Participating NG-RAN Operator to provide its information on operator/network name to UEs, e.g., for display to the user.
[PR 5.1.6-002]
Align with the definitions.
Updated wording
CPR 7.2.2-003
In case of Indirect Network Sharing, UEs shall be able to access their subscribed PLMN services when accessing a Shared NG-RAN, where the subscribed PLMN is one of the Participating NG-RAN Operators.
[PR 5.2.6-004]
Align with the definitions.
Updated wording
CPR 7.2.2-004
Subject to the agreement between hosting operators and the participating operator, the 5G system shall support a mechanism to enable a UE to obtain its subscribed services, including Hosted Services, of participating operator via a Shared NG-RAN using Indirect Network Sharing.
NOTE: the requirement assumes no impact to UE.
[PR 5.5.6-001]
[PR 5.7.6-001]
Align with the definitions.
PRs merged. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7.2.3 Mobility | Table 7.2.3-1 Consolidated Requirements to Mobility
CPR
Consolidated Potential Requirement
Original PR
Comment
CPR 7.2.3-001
In case of an Indirect Network Sharing scenario, and subject to the hosting and participating operators' policies, the 5G system shall support service continuity and minimize the impact on user experience and service interruption between different Shared NG-RANs and/or between a Shared NG-RAN and a non-Shared NG-RAN.
[PR 5.2.6-002]
[PR 5.3.6-001]
[PR 5.2.6-003]
PRs merged
CPR 7.2.3-002
In case of Indirect Network Sharing, the 5G system shall enable the Shared NG-RAN of a hosting operator to provide services for inbound roaming users.
NOTE: Inbound roaming users may not have a roaming agreement with the hosting operator but only with a participating operator.
[PR 5.4.6-001]
PR modified |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7.2.4 Network Access Control | Table 7.2.4-1 Consolidated Requirements on Network Access Control
CPR #
Consolidated Potential Requirement
Original PR #
Comment
CPR 7.2.4-001
In case of Indirect Network Sharing and subject to Hosting and Participating Operators’ policies, the 5G system shall support a mechanism to enable a UE with a subscription to a Participating Operator to select an appropriate radio access network (i.e., E-UTRAN, NG-RAN) and/or access an authorized Shared NG-RAN.
[PR 5.3.6-004]
[PR 5.3.6-003]
CPR 7.2.4-002
In case of Indirect Network Sharing and subject to Hosting and Participating Operators’ policies, the 5G system shall support access control for a UE accessing a Shared NG-RAN and be able to apply differentiated access control for different access networks of Shared NG-RANs when more than one Shared NG-RAN are available for the Participating Operator to choose from.
[PR 5.3.6-002]
[PR 5.7.6-003]
CPR 7.2.4-003
In case of Indirect Network Sharing, based on the Hosting and Participating Operators’ policies, the 5G system shall enable the Participating Operator to provide steering information in order to assist a UE with access network selection amongst the Hosting Operator’s available Shared RAN(s).
[PR 5.7.6-002]
NOTE: the above requirements assume no UE impacts. |
c3de68f0eebb105002ecbb9ed169af5e | 22.851 | 7.2.5 Regulatory Services | Table 7.2.5-1 Consolidated Requirements to the Regulatory Services
CPR # | Consolidated Potential Requirement | Original PR # | Comment
CPR 7.2.5-001 | In case of Indirect Network Sharing and subject to regulatory requirements and mutual agreement between the participating operators and the hosting operator, the 5G system shall support regulatory services (e.g., PWS, emergency calls). | [PR 5.6.6-001], [PR 5.8.6-001] | PRs merged
CPR 7.2.5-002 | In case of Indirect Network Sharing, the 5G system shall be able to provide a UE accessing a Shared NG-RAN with positioning services in compliance with regulatory requirements. | [PR 5.6.6-002] | PR modified
CPR 7.2.5-003 | Subject to regulatory requirements and mutual agreement between the participating operators and the hosting operator, the Shared NG-RAN operator shall be able to broadcast PWS messages originated from the core network of the hosting operator. | [PR 5.8.6-002] | PR modified
7.2.6 Charging
Table 7.2.6-1: Charging requirements
CPR # | Consolidated Potential Requirement | Original PR # | Comment
CPR 7.2.6-001 | The 5G core network shall be able to support collection of charging information associated with a UE accessing a Shared NG-RAN using Indirect Network Sharing. | [PR 5.2.6-001] | Align with the definitions: "shared network" -> "Shared NG-RAN"
8 Conclusion and recommendations
This document analyzes a number of use cases to enhance the support of network sharing. Some considerations and the resulting potential consolidated requirements have been captured in clauses 6 and 7. It is recommended to proceed with normative work and include the consolidated requirements identified, in order to better serve communication services.

Annex A (informative): Change history

Date | Meeting | TDoc | CR | Rev | Cat | Subject/Comment | New version
05/2022 | SA1#98e | S1-221092 | | | | Initial Skeleton | 0.0.0
05/2022 | SA1#98e | | | | | Incorporation of approved pCRs: S1-221271; S1-221292; S1-221272 | 0.1.0
09/2022 | SA1#99e | | | | | Incorporation of approved pCRs: S1-222392; S1-222393; S1-222394; S1-222395; S1-222396 | 0.2.0
11/2022 | SA1#100 | | | | | Incorporation of approved pCRs: S1-223623; S1-223624; S1-223682; S1-223626; S1-223683; S1-223599 | 0.3.0
12/2022 | SA#98e | SP-221266 | | | | Raised to v.1.0.0 by MCC for presentation for information to SA#98e | 1.0.0
02/2023 | SA1#101 | | | | | Incorporation of approved pCRs: S1-230579; S1-230746; S1-230361; S1-230580; S1-230067; S1-230781; S1-230782; S1-230583 | 1.1.0
05/2023 | SA1#102 | | | | | Incorporation of approved pCRs: S1-231523; S1-231182; S1-231138; S1-231525; S1-231506; S1-231526; S1-231723; S1-231527; S1-231522 | 1.2.0
06/2023 | SA#100 | SP-230508 | | | | MCC clean-up for approval by SA | 2.0.0
06/2023 | SA#100 | SP-230508 | | | | Raised to v.19.0.0 by MCC following approval by SA | 19.0.0
2023-09 | SA#101 | SP-231019 | 0001 | 1 | F | Updates on the TR 22.851 consolidation and consideration | 19.1.0
2023-09 | SA#101 | SP-231019 | 0003 | 1 | F | Update Network Access Control CPR | 19.1.0
3GPP TR 22.856
1 Scope
The present document investigates specific use cases and service requirements for 5GS support of enhanced XR-based services (XR-based services are an essential part of the "Metaverse" services considered in this study), as well as potentially other functionality, to offer a shared and interactive user experience of local content and services, accessed either by users in proximity or remotely. In particular, the following areas are studied:
- Support of interactive XR media shared among multiple users in a single location, including:
- performance (KPI) aspects; e.g. latency, throughput, connection density
- efficiency and scalability aspects, for large numbers of users in a single location.
- the combination of haptics type of XR media and other non-haptics types of XR media.
- Identification of users and other digital representations of entities interacting within the Metaverse service.
- Acquisition, use and exposure of local (physical and digital) information to enable Metaverse services, including:
- acquiring local spatial/environmental information and user/UE(s) information (including viewing angle, position and direction);
- exposing local acquired spatial, environmental and user/UE information to 3rd parties to enable Metaverse services.
- Other aspects, such as privacy, charging, public safety and security requirements.
The study also investigates gaps between the identified new potential requirements and the requirements already specified for the 5G system.
It is acknowledged that there are activities related to the topic Metaverse outside of 3GPP, such as the W3C Open Metaverse Interoperability Group (OMI). These activities may be considered in the form of use cases and related contributions to this study, but there is no specific objective for this study to consider or align with external standardization activities.
A difference between this study and TR 22.847 "Study on supporting tactile and multi-modality communication services" is that Metaverse services would involve coordination of input data from different devices/sensors from different users and coordination of output data to different devices at different destinations to support the same task or application.
2 References
The following documents contain provisions which, through reference in this text, constitute provisions of the present document.
- References are either specific (identified by date of publication, edition number, version number, etc.) or non‑specific.
- For a specific reference, subsequent revisions do not apply.
- For a non-specific reference, the latest version applies. In the case of a reference to a 3GPP document (including a GSM document), a non-specific reference implicitly refers to the latest version of that document in the same Release as the present document.
[1] 3GPP TR 21.905: "Vocabulary for 3GPP Specifications".
[2] 3GPP TS 22.228: "Service requirements for the Internet Protocol (IP) Multimedia core network Subsystem (IMS)".
[3] 3GPP TS 22.173: "IP Multimedia Core Network Subsystem (IMS) Multimedia Telephony Service and supplementary services".
[4] 3GPP TS 22.101: "Service principles".
[5] 3GPP TS 22.261: "Service requirements for the 5G system".
[6] Talaba D, Horvath I, Lee K H, "Special issue of Computer-Aided Design on virtual and augmented reality technologies in product design," Computer-Aided Design 42:361 - 363.
[7] Wang X, Tsai J J-H, "Collaborative Design in Virtual Environments," In: ISCA 48:1-2.
[8] Benford S, Greenhalgh C, Rodden T, "Collaborative virtual environments." Communications of the ACM 44: 79–85.
[9] M. Eid, J. Cha, and A. El Saddik, "Admux: An adaptive multiplexer for haptic-audio-visual data communication", IEEE Tran. Instrument. and Measurement, vol. 60, pp. 21–31, Jan 2011.
[10] K. Iwata, Y. Ishibashi, N. Fukushima, and S. Sugawara, "QoE assessment in haptic media, sound, and video transmission: Effect of playout buffering control", Comput. Entertain., vol. 8, pp. 12:1–12:14, Dec 2010.
[11] N. Suzuki and S. Katsura, "Evaluation of QoS in haptic communication based on bilateral control", in IEEE Int. Conf. on Mechatronics (ICM), Feb 2013, pp. 886–891.
[12] E. Isomura, S. Tasaka, and T. Nunome, "A multidimensional QoE monitoring system for audiovisual and haptic interactive IP communications", in IEEE Consumer Communications and Networking Conference (CCNC), Jan 2013, pp. 196–202.
[13] A. Hamam and A. El Saddik, "Toward a mathematical model for quality of experience evaluation of haptic applications", IEEE Tran. Instrument. and Measurement, vol. 62, pp. 3315–3322, Dec 2013.
[14] O. Holland et al., "The IEEE 1918.1 "Tactile Internet" Standards Working Group and its Standards," Proceedings of the IEEE, vol. 107, no. 2, Feb. 2019.
[15] Lee, Donghwan et. al., "Large-scale Localization Datasets in Crowded Indoor Spaces, " NAVER LABS, NAVER LABS Europe, CVPR 2021.
[16] 3GPP TR 22.837: "Study on Integrated Sensing and Communication".
[17] Alriksson, F., Kang, D.H., Phillips, C., Pradas, J.L. and Zaidi, A., 2021. XR and 5G: Extended reality at scale with time-critical communication. Ericsson Technology Review.
[18] 3GPP TR 26.928: "Extended Reality (XR) in 5G."
[19] Ge, Y., Wen, F., Kim, H., Zhu, M., Jiang, F., Kim, S., Svensson, L. and Wymeersch, H., 2020. 5G SLAM using the clustering and assignment approach with diffuse multipath. Sensors, 20(16), p.4656.
[20] Kim, H., Granström, K., Gao, L., Battistelli, G., Kim, S. and Wymeersch, H., 2020. 5G mmWave cooperative positioning and mapping using multi-model PHD filter and map fusion. IEEE Transactions on Wireless Communications, 19(6), pp.3782-3795.
[21] Liu, A., Huang, Z., Li, M., Wan, Y., Li, W., Han, T.X., Liu, C., Du, R., Tan, D.K.P., Lu, J. and Shen, Y., 2022. A survey on fundamental limits of integrated sensing and communication. IEEE Communications Surveys & Tutorials, 24(2), pp.994-1034.
[22] Dwivedi, S., Shreevastav, R., Munier, F., Nygren, J., Siomina, I., Lyazidi, Y., Shrestha, D., Lindmark, G., Ernström, P., Stare, E. and Razavi, S.M., 2021. Positioning in 5G networks. IEEE Communications Magazine, 59(11), pp.38-44.
[23] Suhr, C. "S.L.A.M. and Optical Tracking for XR", https://medium.com/desn325-emergentdesign/s-l-a-m-and-optical-tracking-for-xr-cfabb7dd536f <accessed 1.9.22>
[24] Kim, H., Granstrom, K., Svensson, L., Kim, S. and Wymeersch, H., 2022. PMBM-based SLAM Filters in 5G mmWave Vehicular Networks. IEEE Transactions on Vehicular Technology.
[25] 5GAA "C-V2X Use Cases Volume II: Examples and Service Level Requirements", 5G Automotive Association White Paper, https://5gaa.org/wp-content/uploads/2020/10/5GAA_White-Paper_C-V2X-Use-Cases-Volume-II.pdf <accessed 02.09.22>
[26] A. Ebrahimzadeh, M. Maier and R. H. Glitho, "Trace-Driven Haptic Traffic Characterization for Tactile Internet Performance Evaluation," 2021 International Conference on Engineering and Emerging Technologies (ICEET), 2021, pp. 1-6.
[27] Lee, L.-H., Braud, T., Zhou, P., Wang, L., Xu, D., Lin, Z., Kumar, A., Bermejo, C., and Hui, P. All One Needs to Know about Metaverse: A Complete Survey on Technological Singularity, Virtual Ecosystem, and Research Agenda, 2021.
[28] Halbhuber, David & Henze, Niels & Schwind, Valentin. (2021). Increasing Player Performance and Game Experience in High Latency Systems. Proceedings of the ACM on Human-Computer Interaction. 5. 1-20. 10.1145/3474710.
[29] ITU-T Recommendation Y.3090 (02/22): "Digital twin network - Requirements and architecture" (https://www.itu.int/rec/T-REC-Y.3090-202202-I).
[30] Skalidis, I., Muller, O. and Fournier, S., 2022. CardioVerse: The Cardiovascular Medicine in the Era of Metaverse. Trends in Cardiovascular Medicine.
[31] Koo, H., 2021. Training in lung cancer surgery through the metaverse, including extended reality, in the smart operating room of Seoul National University Bundang Hospital, Korea. Journal of educational evaluation for health professions, 18.
[32] Mozumder, M.A.I., Sheeraz, M.M., Athar, A., Aich, S. and Kim, H.C., 2022, February. Overview: technology roadmap of the future trend of metaverse based on IoT, blockchain, AI technique, and medical domain metaverse activity. In 2022 24th International Conference on Advanced Communication Technology (ICACT) (pp. 256-261). IEEE.
[33] Ning, H., Wang, H., Lin, Y., Wang, W., Dhelim, S., Farha, F., Ding, J. and Daneshmand, M., 2021. A Survey on Metaverse: the State-of-the-art, Technologies, Applications, and Challenges. arXiv preprint arXiv:2111.09673.
[34] https://www.xrtoday.com/virtual-reality/ukrainian-doctors-perform-first-vr-surgery/
[35] https://www.cryptotimes.io/surgeon-performed-surgery-remotely-through-metaverse/
[36] Senk, S., Ulbricht, M., Tsokalo, I., Rischke, J., Li, S.C., Speidel, S., Nguyen, G.T., Seeling, P. and Fitzek, F.H., 2022. Healing Hands: The Tactile Internet in Future Tele-Healthcare. Sensors, 22(4), p.1404.
[37] Chen, D. and Zhang, R., 2022. Exploring Research Trends of Emerging Technologies in Health Metaverse: A Bibliometric Analysis. Available at SSRN 3998068.
[38] https://www.yankodesign.com/2022/05/15/the-metaverse-has-the-power-to-improve-healthcare-and-it-has-already-begun/
[39] 3GPP TS 23.501: "System architecture for the 5G System (5GS)".
[40] 3GPP TS 38.300: "NR; NR and NG-RAN Overall description; Stage-2".
[41] 3GPP TS 22.826: "Study on communication services for critical medical applications".
[42] Tataria, H., Shafi, M., Molisch, A.F., Dohler, M., Sjöland, H. and Tufvesson, F., 2021. 6G wireless systems: Vision, requirements, challenges, insights, and opportunities. Proceedings of the IEEE, 109(7), pp.1166-1199.
[43] 3GPP TS 23.682: "Architecture enhancements to facilitate communications with packet data networks and applications".
[44] 3GPP TS 23.273: "5G System (5GS) Location Services (LCS); Stage 2".
[45] "MPEG-4 Face and Body Animation", https://visagetechnologies.com/mpeg-4-face-and-body-animation/ (accessed 2.11.22).
[46] 3GPP TS 22.105: "Services and service capabilities".
[47] Virtual humans: https://en.wikipedia.org/wiki/Virtual_humans
[48] Kwang Soon Kim, et al., "Ultrareliable and Low-Latency Communication Techniques for Tactile Internet Services", PROCEEDINGS OF THE IEEE, Vol. 107, No. 2, February 2019
[49] I.-J. Hirsh and C. E. J. Sherrick, "Perceived order in different sense modalities, " Journal of Experimental Psychology, vol. 62, no. 5, pp. 423–432, 1961.
[50] VESA Compression Codecs - vesa.org/vesa-display-compression-codecs
[51] 3GPP TR 23.700-80: "Study on 5G System Support for AI/ML-based Services".
[52] EU data protection rules | European Commission (europa.eu) (https://ec.europa.eu/info/law/law-topic/data-protection/eu-data-protection-rules_en)
[53] California Consumer Privacy Act (CCPA) | State of California - Department of Justice - Office of the Attorney General (https://oag.ca.gov/privacy/ccpa)
[54] Y. Sun, Z. Chen, M. Tao, and H. Liu, "Communications, caching, and computing for mobile virtual reality: Modeling and trade-off, " IEEE Trans. Commun., vol. 67, no. 11, pp. 7573–7586, Nov. 2019.
[55] Y. Cai, J. Llorca, A. M. Tulino and A. F. Molisch, "Compute- and Data-Intensive Networks: The Key to the Metaverse," 2022 1st International Conference on 6G Networking (6GNet), 2022.
[56] 3GPP TS 22.226: "Global Text Telephony".
[57] ITU-T SG16 "ITU-T SG 16 Work on Accessibility - How ITU is Pioneering Telecom Accessibility for All", https://www.itu.int/en/ITU-T/studygroups/com16/accessibility/Pages/telecom.aspx, accessed 30.1.23.
[58] Ankit Ojha, Ayush Pandey, Shubham Maurya, Abhishek Thakur, Dr. Dayananda P, "Sign Language to Text and Speech Translation in Real Time Using Convolutional Neural Network," INTERNATIONAL JOURNAL OF ENGINEERING RESEARCH & TECHNOLOGY (IJERT) NCAIT – 2020 (Volume 8 – Issue 15).
[59] R. Kazantsev and D. Vatolin, "Power Consumption of Video-Decoders on Various Android Devices," 2021 Picture Coding Symposium (PCS), Bristol, United Kingdom, 2021, pp. 1-5, doi: 10.1109/PCS50896.2021.9477481.
[60] glTF 2.0 specification, https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html, accessed 02/02/2023.
[61] B. Egger, W. A. P. Smith, A. Tewari, S. Wuhrer, M. Zollhoefer, T. Beeler, F. Bernard, T. Bolkart, A. Kortylewski, S. Romdhani, C. Theobalt, V. Blanz, and T. Vetter. "3D Morphable Face Models—Past, Present, and Future". ACM Trans. Graph. 39, 5, Article 157 (Jun 2020)
[62] M. Zollhöfer, J. Thies, P. Garrido, D. Bradley, T. Beeler, P. Pérez, M. Stamminger, M. Nießner, and C. Theobalt. "State of the Art on Monocular 3D Face Reconstruction, Tracking, and Applications". Computer Graphics Forum (2018).
[63] 3GPP TS 23.503: "Policy and charging control framework for the 5G System (5GS); Stage 2".
3 Definitions of terms, symbols and abbreviations
3.1 Terms
For the purposes of the present document, the terms given in 3GPP TR 21.905 [1] and the following apply. A term defined in the present document takes precedence over the definition of the same term, if any, in 3GPP TR 21.905 [1].
avatar: a digital representation, specific to media, that encodes the facial (and possibly body) position, motions and expressions of a person or of some software-generated entity.
Conference: An IP multimedia session with two or more participants. Each conference has a "conference focus". A conference can be uniquely identified by a user. Examples for a conference could be a Telepresence or a multimedia game, in which the conference focus is located in a game server.
NOTE 1: This definition was taken from TS 22.228 [2].
Conference Focus: The conference focus is an entity which has abilities to host conferences including their creation, maintenance, and manipulation of the media. A conference focus implements the conference policy (e.g. rules for talk burst control, assigning priorities and participants’ rights).
NOTE 2: This definition was taken from TS 22.228 [2].
digital asset: digitally stored information that is uniquely identifiable and can be used to realize value according to its licensing conditions and applicable regulations. Examples of digital assets include a digital image (avatar), software licenses, gift certificates and files (e.g. music files) that have been purchased under a license that allows resale.
digital representation: the mobile metaverse media associated with the presentation of a particular virtual or physical object. The digital representation could present the current state of the object. One example of a digital representation is an avatar, see Annex A.
digital twin: A real-time representation of physical assets in a digital world.
NOTE 3: This definition was taken from ITU-T Recommendation Y.3090 [29].
gesture: a change in the pose that is considered significant, i.e. as a discriminated interaction with a mobile metaverse service.
immersive: a characteristic of a service experience or AR/MR/VR media, seeming to surround the user, so that they feel completely involved.
localization: A known location in three-dimensional space, including an orientation, e.g. defined as pitch, yaw and roll.
location related service experience: user interaction and information provided by a service to a user that is relevant to the physical location in which the user accesses the service.
location agnostic service experience: user interaction and information provided by a service to a user that has little or no relation to the physical location in which the user accesses the service. Rather the service provides interaction and information concerning either a distant or a non-existent physical location.
mobile metaverse media: media communicated or enabled using the 5G system including audio, video, XR (including haptic) media, and data from which media can be constructed (e.g. a 'point cloud' that could be used to generate XR media.)
mobile metaverse: the user experience enabled by the 5G system of interactive and/or immersive XR media, including haptic media.
mobile metaverse server: an application server that supports one or more mobile metaverse services to a user access by means of the 5G system.
mobile metaverse service: the service that provides a mobile metaverse experience to a user by means of the 5G system.
pose: the relative location, orientation and direction of the parts of a whole. The pose can refer to the user, specifically to identify the position of a user's body. The pose can also refer to an entity or object (whose parts can adopt different locations, orientations, etc.) that the user interacts with by means of mobile metaverse services.
predictive digital representation model: a model used for creating a digital representation (e.g. avatar) of a user or object in AR/MR/VR communication that helps to compensate for communication latency and/or conceal negative effects of high communication latency between the users by predicting for example events, changes or outcomes that impact the digital representation, such as predicting the current position or pose of a remote user.
service information: this information is out of scope of standardization but could contain, e.g. a URL, media data, media access information, etc. This information is used by an application to access a service.
spatial anchor: an association between a location in space (three dimensions) and service information that can be used to identify and access services, e.g. information to access AR media content.
spatial map: A collection of information that corresponds to space, including information gathered from sensors concerning characteristics of the forms in that space, especially appearance information.
spatial mapping service: A service offered by a mobile network operator that gathers sensor data in order to create and maintain a Spatial Map that can be used to offer customers Spatial Localization Service.
spatial localization service: A service offered by a mobile network operator that can provide customers with Localization.
User Identifier: a piece of information used to identify one specific User Identity in one or more systems.
NOTE 4: This definition was taken from TS 22.101 [4].
User Identity: information representing a user in a specific context. A user can have several user identities, e.g. a User Identity in the context of his profession, or a private User Identity for some aspects of private life.
NOTE 5: This definition was taken from TS 22.101 [4].
User Identity Profile: A collection of information associated with the User Identities of a user.
NOTE 6: This definition was taken from TS 22.101 [4].
digital wallet: a type of digital asset container, also known as an e-wallet or mobile wallet; a software application that securely stores digital credentials, such as payment information, loyalty cards, tickets, and other digital assets. It allows users to make electronic transactions, such as payments and transfers, conveniently and securely using their digital credentials.
NOTE 7: Digital wallets typically employ encryption and authentication mechanisms to protect the stored information and ensure the security of transactions. |
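As a purely illustrative, non-normative sketch, the "spatial anchor" term defined above can be pictured as a simple data structure pairing a localization (position plus orientation) with opaque service information; all field names below are hypothetical and not part of any 3GPP specification.

```python
# Illustrative only: a spatial anchor associates a 3D location (with
# orientation, i.e. a localization) with service information such as a URL
# or media access information. Field names are assumptions, not normative.
from dataclasses import dataclass

@dataclass
class Localization:
    x: float          # position in metres
    y: float
    z: float
    pitch: float      # orientation in degrees
    yaw: float
    roll: float

@dataclass
class SpatialAnchor:
    location: Localization
    service_info: dict  # out of scope of standardization, e.g. {'url': ...}

anchor = SpatialAnchor(
    Localization(12.0, 3.5, 0.0, 0.0, 90.0, 0.0),
    {"url": "https://example.com/ar-content", "type": "restaurant"},
)
print(anchor.service_info["type"])  # restaurant
```

The service_info content is deliberately opaque here, mirroring the definition's note that this information is out of scope of standardization.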
3.2 Symbols
Void.
3.3 Abbreviations
For the purposes of the present document, the abbreviations given in 3GPP TR 21.905 [1] and the following apply. An abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in 3GPP TR 21.905 [1].
AI Artificial Intelligence
CCTV Closed-Circuit TeleVision
DoF Degrees of Freedom
DVE Distributed Virtual Environment
FACS Facial Action Coding System
FOV Field Of View
LiDAR Light Detection And Ranging
VRU Vulnerable Road User |
4 Overview
Mobile metaverse services are discussed in this technical report both in the abstract and in concrete terms. Specific services mentioned in the TR include:
- Situational awareness for drivers, pedestrians, cyclists, to increase safety and efficiency of transport (see 5.2).
- XR enabled collaborative and concurrent engineering, to enable local and remote collaboration (see 5.3).
- Participation in and passive observation of virtual reality events, e.g. basketball (see 5.6).
- Presentation of AR content, e.g. a feature length movie, on a virtual screen (see 5.7).
- Remote critical health care, including surgery and treatment (see 5.10).
The study also considers a number of use cases that feature new service enablers, including:
- Providing users with information and services that are of local relevance (see 5.1).
- Enhancements to IMS to support multiple users and multi-modal XR communication (see 5.3).
- Support for spatial anchors to link service information to specific locations (see 5.4).
- Support for spatial localization and mapping services, and enablers for them in the 5GS (see 5.5).
- Support for multi-service coordination for different input and output devices and diverse services (see 5.8).
- Support for synchronization of different data streams and predicted network conditions (especially latency) to enable immersive remote collaboration despite significant distance and therefore communication delay between participants (see 5.9). |
5 Use Cases
5.1 Use Case on Localized Mobile Metaverse Service
5.1.1 Description
This use case will consider the potential service opportunities that arise when advanced location information is available to trigger AR based services.
A precursor to this use case is briefly considered: i-mode service, introduced by NTT DOCOMO in 1999. The discussion of i-mode serves as an inspiration. This service was extremely successful, was one of the early mobile services beyond messaging and voice, and has many potential similarities with metaverse services. This service in many ways preceded and foresaw mobile internet services that would arise 10 years later. Users could access data on-line concerning weather, traffic, etc. While there were many revolutionary aspects to i-mode, three are particularly relevant for this use case:
- i-area – a location information service that enabled the user to identify locally relevant information – concerning traffic, maps and retail store information for business in the user’s proximity.
- i-channel – a distribution service of ‘latest information’ that the user could further investigate (through interaction) and whose display was user configurable.
- a fully decentralized content and service creation framework allowing third parties to easily provide content, especially relevant: location specific content. This made it possible for small businesses to provide information to potential customers in the proximity such as opening hours, special offers, etc. It was even possible for those in the same location to meet and join 'virtually,' e.g. to play a computer game with other passengers in the same train car or bus.
In this use case, analogously, services that are locally relevant can be accessed, with relevant information retrieved. The user will have the ability to selectively control which of this information to display. The content that is obtained comes from decentralized sources - in this case different individual merchants will provide this information.
The use case described in this clause does not rely on or recreate the i-mode service. Rather, some of the ideas in i-mode service are carried forward given the new opportunities enabled by localized 3D interactive media. Localized mobile metaverse services creates exciting new opportunities to receive locally relevant content and interact with services.
These capabilities taken to a much greater level will form the essence of the coming mobile metaverse user experience. What distinguishes this service most is that it will provide the user with information services integrated into their ordinary experience. Consider an example of a commuter navigating an underground passage. Diverse relevant information is integrated into the user's field of view, as shown in Figure 5.1.1-1 below.
Figure 5.1.1-1: Localized Mobile Metaverse Services offering relevant information
Here, the AR annotation provides much more than an augmented map. The user is going to catch a train. (a) The path to the platform is shown without obstructing the user's perception of their proximity, where the contrast is good and no distractions appear. (b) The store on the right can provide content that may be relevant to the potential shopper, here the store's opening hours. Further along, (c) a restaurant provides a personalized message, reminding the user that they ate there in the past and ordered soup. These services are linked to the space that the user is in. See Figure 5.1.1-2, below.
Figure 5.1.1-2: Services offering relevant information are anchored in space
The three information augmentations that are displayed are the result of different sources of information. The path information (a) can be presented anywhere that it fits into the scene, whereas the information for (b) and (c) is anchored in space. The content shown depends on interaction between the user and the service (the user's preferences, application context, etc.): (a) they are travelling, and the navigation app knows what information the user needs to see; (b) the user has sought Blue Lotion and it is available here - the user's 'persistent search' shows local results; (c) at this time of day, the user often eats, so the reminder from the restaurant on the left is welcome.
The 5G system enables this 'access to local services' in a number of ways. |
5.1.2 Pre-conditions
Ulysses uses his AR capable glasses while travelling. These are tethered to a UE that he carries. This UE receives 5G service from the mobile network operator he has a subscription with, M. Ulysses is interested in using his AR capable glasses to receive relevant information, so he has selected which applications are 'relevant.' These applications are therefore configured with the operator M to be 'active.' The purpose of this configuration will be described below.
Local services, that is services that are localized, have anchors that can be relevant to applications. A pre-condition of this use case is that such services exist and have spatially defined access. This is considered in this use case as an 'anchor.' The local service provider associates a service with an anchor, potentially as well as metadata concerning the service (e.g. 'is a restaurant'.) This information is available to M, either because the local services have been registered with M directly, or are available in maps, registries, or other information sources that M has access to (e.g. information that is provided by the UE). |
5.1.3 Service Flows
Ulysses transfers at the Osaka train station. He has some time before he catches his connecting train. As he is hungry, he activates a 'persistent search' on his mobile device to inform him of opportunities to eat as he traverses the station. His mobile device begins to collect information about his surroundings. The collected information can include information that is obtained by interacting with nearby devices (e.g. sensors and other mobile devices). He also has a shopping list (a set of items of interest) from retail stores. He makes use of a navigation facility so that he will neither lose his way nor lose track of time.
This has the following consequences:
a) As a result of the applications activated by the user, and the user's preferences, the UE requests the 'localized mobile metaverse' service enabler offered by M, providing a list of 'persistent search' information and the information that was collected from the user’s surroundings.
b) M, receiving the persistent search and the information that was collected from the user’s surroundings, engages localized service activation. The location of the UE is compared against the search criteria, the information that was collected from the user’s surroundings and the information available to M on spatially defined access points.
c) M identifies a match - essentially a 'JOIN' of user preferences / application persistent search criteria "restaurants" AND location (in the user's field of view) AND local spatially defined access exists.
d) This match is provided to the UE, for further processing by the application.
Example 1: The application [service] associated with 'restaurants' has stored information the user has eaten there and indicated it was 'good.' Thus the annotation 'Soup you liked last time' is displayed.
Example 2: The application [service] associated with 'shopping' has stored a shopping list. When a local 'shopping service access point' has been identified, the shopping application queries the service provider directly. If there is success, the result is displayed. Here: "Blue Lotion you want ¥2000."
e) The UE is able to interact with the application to obtain information associated with the service. |
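The 'JOIN' described in step c) can be sketched as a simple filter over spatially defined access points: a match requires both a persistent-search category hit and spatial proximity to the UE. This is an illustrative sketch only; the names (ServicePoint, match_services) and the 50 m radius are assumptions, not anything defined by the 5G system.

```python
# Hypothetical sketch of the 'JOIN' in step c): persistent-search criteria
# AND location AND a local spatially defined access point.
from dataclasses import dataclass
from math import hypot

@dataclass
class ServicePoint:
    category: str   # e.g. "restaurant", "shopping"
    x: float        # position in a local coordinate frame (metres)
    y: float
    info: str       # service information, e.g. an annotation or URL

def match_services(points, ue_x, ue_y, criteria, radius_m=50.0):
    """Return service points that satisfy the criteria AND lie within radius of the UE."""
    return [p for p in points
            if p.category in criteria
            and hypot(p.x - ue_x, p.y - ue_y) <= radius_m]

points = [
    ServicePoint("restaurant", 10.0, 5.0, "Soup you liked last time"),
    ServicePoint("shopping", 20.0, 0.0, "Blue Lotion you want ¥2000"),
    ServicePoint("restaurant", 900.0, 0.0, "Too far to display"),
]
hits = match_services(points, 0.0, 0.0, {"restaurant", "shopping"})
print([p.info for p in hits])
```

The matches returned here correspond to step d); rendering the annotations in the user's field of view is left to the application, as in step e).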
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.1.4 Post-conditions | Spatial information has successfully been employed to allow a user to identify services. Please see 5.5 for further information on how spatial information is obtained.
The user's location has been applied.
The information output of different applications are integrated into the media that the user sees through the AR display device. The information associated with the services is displayed in the proper location in the user's field of view.
Ulysses may choose to eat soup or buy Blue Lotion.
M may charge Ulysses for this service, e.g. for the use of a persistent search and for each successful result provided. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.1.5 Existing feature partially or fully covering use case functionality | Location-based services exist to identify the position of the UE.
The 5GS supports means to expose the UE's location to a 3rd party service provider.
Location Services for CIoT in 23.682, 4.5.19 [43], allow an LCS client to obtain the location of a CIoT UE, either periodically or in a triggered manner, as well as the last known location of a UE that is unreachable for an extended time.
Location services, as specified in 23.273, 5.5 [44] support exposure to authorized third parties through a CAPIF API between NEF and the AF. The information that can be exposed includes the target UE identity and parameters for location events, e.g. the trigger (the UE enters, leaves or is within a target area), the time between reports if multiple reports are requested, and of course the precise location in three dimensions up to the maximum horizontal and vertical accuracy supported.
Effectively, there are also numerous 'over the top' mechanisms that essentially communicate GPS or other location information that can be accessed by the application from the mobile OS and terminal equipment. The client application on the UE supplies this location information to an AS 'opaque' to the 5G system (that is, in application traffic that is not visible to the 5G system.) In this sense, the 5G system supports location services in that it provides a UE with the mobile broadband services.
Using the location information obtained by any of these three mechanisms described above, a service provider can identify appropriate content for or interaction with the user, by means that are out of scope of 3GPP. This could include the services introduced in 5.1.1 in the example of i-mode. The application service can be configured or used in such a way that the user's interest is known, and location-relevant information can be 'pushed' to the client application.
There is a significant gap between this support and the functionality described in this use case. Localized Mobile Metaverse Services, unlike e.g. i-mode and similar location-service-enabled applications, do not assume the use of a specific mediating application that organizes and delivers information. Rather, this use case features and motivates a service enabler that allows the user to discover different locally relevant services and content, so that any available application service can be accessed by the UE, e.g. through a web browser or other interactive-media capable general purpose application. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.1.6 Potential New Requirements | [PR 5.1.6-1] The 5G system shall enable third parties to make known the availability of application services (i.e. provided by Application Servers) associated with a precise location.
[PR 5.1.6-2] The 5G system shall provide suitable exposure mechanisms for application services (i.e. provided by Application Servers) associated with a precise location available in the user's proximity (e.g. within line of sight), such that the mobile metaverse services can conform to specific service constraints.
NOTE: The term 'service constraints' implies that certain targets of service discovery are supported, e.g. to find 'restaurants.'
[PR 5.1.6-3] The 5G system shall provide suitable means to discover mobile metaverse services (i.e. provided by mobile metaverse servers) associated with a precise location available in the user's proximity.
[PR 5.1.6-4] The 5G system shall enable the rendering of diverse media, from one or more mobile metaverse services associated with a single location, to be combined to form a single location related service experience. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.2 Use Case on Mobile Metaverse for 5G-enabled Traffic Flow Simulation and Situational Awareness | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.2.1 Description | Smart transport is a very important area for the 5G system as well as an important mobile metaverse service. To reduce traffic jams and minimize traffic accidents, 5G, including cellular-based V2X technologies, becomes more and more essential. The 5G system can be utilized to support real-time information and data delivery for the traffic participants, including pedestrians, bicycle riders, and vehicles with or without autonomous driving mode. As shown in Figure 5.2.1-1, the physical objects, including road infrastructure and vehicles such as cars and trucks in each lane, will have a corresponding digital twin in the virtual world, and the virtual and physical objects form the mobile metaverse. In this use case, there are virtual objects which represent the physical objects, including vehicles, roads and also pedestrians. This is the basis to enable smart transport applications like traffic flow simulation and situational awareness.
Figure 5.2.1-1 Example of Smart Transport Metaverse
With the support of the 5GS, real-time information and data about the physical objects can be delivered, and the virtual objects of the road infrastructure and traffic participants, including vulnerable road users, can form a smart transport mobile metaverse service as shown in Figure 5.2.1-2. Real-time processing and computing can then be conducted to support traffic simulation, situational awareness, and real-time path guidance. Real-time safety or security alerts can be generated for vehicles as well as for the driver and passengers.
Figure 5.2.1-2 Scenario of 5G-enabled Traffic Flow Simulation and Situational Awareness
In order to support the traffic flow simulation and situational awareness service, the 5G network needs to provide low-latency, high-data-rate and high-reliability transmission, and handover procedures that minimize service disruptions; in addition, the 5G network may also need to be further enhanced to meet the service requirements for 5G-enabled traffic flow simulation and situational awareness. Meanwhile, in addition to the physical objects, which may use UEs for telecommunication services, their corresponding virtual objects are also capable of interacting with each other and with physical objects via the 5GS.
This use case employs terminology defined in Table 5.2.1-1.
Table 5.2.1-1: Traffic Flow Simulation and Situational Awareness Terminology
Situational Awareness: The ability to know and understand what is going on in the surroundings, e.g. the perception of environmental elements and events |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.2.2 Pre-conditions | 1. Traffic participants may or may not be equipped with 5G-enabled terminal equipment. The 5G-enabled terminal equipment can send and receive data via 5G network.
2. Computing and storage resources are provided for the mobile metaverse servers, deployed locally or over the data network (in a centralized cloud), to allow real-time processing of the huge amounts of data produced by humans, vehicles, cameras, radars, etc.
3. Wired or wireless communication resources are configured among the road infrastructure and the mobile metaverse server so that the server has real-time information about the traffic participants perceived by these sensors. The road infrastructure includes cameras, radar/lidar and also other devices, e.g. for traffic control and guidance. The infrastructure equipment may have a wired connection, e.g. fiber, Ethernet or any other wired connection if available. The cellular wireless network can be used to provide a more flexible way to obtain communication services, especially when a wired connection is not available. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.2.3 Service Flows | 1. Wired or wireless communication paths are configured among the road infrastructure and the mobile metaverse server so that the server has real-time information about the traffic participants perceived by these sensors.
2. Sensors deployed at the roadside are initialized and enter normal working mode, which means the sensors can start to capture the traffic participants.
3. Data connections between the vehicle driver’s UE and the server are established, with the 5G UE module being registered with the server. Vehicles without a driver are equipped with a 5G UE module.
4. The traffic participants, including pedestrians, bicycle riders, and vehicles, send real-time information towards the server by means of 5G UEs. The real-time status information includes position, speed, direction, brake status, etc. Other related status information can be, but is not limited to, sensor data for XR (haptic, audio, video, etc.) or smart transport (traffic light, camera, radar/LiDAR, etc.). These traffic participants are physical objects in the physical world which send their property and status information to the corresponding digital-twin objects, that is, virtual objects. This real-time information may include structured and unstructured data. Structured data has normally been processed and formatted in a defined way; it can easily be stored in a database and, when transmitted via the 5G wireless network, normally needs less transmission resource (e.g. a lower data rate). Unstructured data is not formatted in a pre-defined way, is not easy to store in a database, and needs more transmission resource when transmitted over the 5G network.
5. Within the mobile metaverse service, for 5G-enabled traffic flow simulation and situational awareness, real-time information about the physical objects, including the road infrastructure and the traffic participants, as well as other information from traffic light signals, cameras, radar, etc., is synchronized to the virtual objects, and real-time simulation is conducted. The virtual objects can also interact with each other within the virtual world and interact with physical objects via the 5G system.
Interaction among virtual and physical objects is essential and beneficial for traffic simulation and situational awareness, as the virtual objects can have their own intelligence, which allows them to request telemetry from physical objects or other virtual objects. Thus, not only the physical but also the virtual objects need to interact via the 5G system, and they need to be identified. Interaction means that the physical objects deliver real-time status data to their digital twins (i.e. virtual objects) in one direction, and the virtual objects can also send alerts or other information to the physical objects.
Such interaction needs to be realized via the 5GS because the physical objects, like vehicles or humans, need to use a 5G UE to access the 5G system and use 5G resources to establish a connection with 5G system QoS support. Identification of these physical and virtual objects by the 5GS, and thus their association, is needed to support such interaction. Meanwhile, from the 5G system perspective, this makes it easier for the 5G system, based on operator policy agreed with the 3rd party, to provide bearer services with the corresponding QoS guarantee for both physical and virtual objects, as they can be treated as UEs by the 5GS.
It is noted that physical objects may be static or dynamic. Some static objects deployed along the roadside, like light poles, may not move, but their properties can be synchronized with their virtual objects if such properties impact traffic simulation and situational awareness or the visualization processing of the physical world. These physical objects may or may not have dedicated UEs. Pedestrians and vehicles are equipped with UEs, but sensors like cameras, LiDAR and radar may share a common communication module to establish a connection with mobile metaverse services. The communication module can be a wired node or a wireless UE.
6. Traffic flow simulation is conducted, which can predict whether there will be a traffic jam and which path is optimal for a certain vehicle. The simulation can generate traffic assistance or guidance in a real-time manner towards the traffic participants.
7. The mobile metaverse server sends the traffic assistance or guidance information to the UE which serves the pedestrian, bicycle rider, vehicle driver or autonomous vehicle. |
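As a rough illustration of why the structured status information in step 4 needs far less transmission resource than unstructured sensor streams, the following sketch encodes a hypothetical telemetry message (the field names and values are assumptions, not a 3GPP-defined format) and estimates its bit rate at a 10 Hz update interval:

```python
# Illustrative structured real-time status message as in step 4, and its
# approximate bit rate at a 100 ms (10 Hz) update interval.
import json

status = {
    "object_id": "vehicle-0042",            # physical object served by a UE
    "position": [31.2304, 121.4737, 4.5],   # lat, lon, altitude (illustrative)
    "speed_mps": 13.9,
    "heading_deg": 87.0,
    "brake": False,
}
encoded = json.dumps(status).encode("utf-8")
bits_per_message = len(encoded) * 8
update_rate_hz = 10                          # typical telemetry interval of 100 ms
bitrate = bits_per_message * update_rate_hz  # bit/s
print(bitrate)
```

Even with a verbose text encoding, the result stays well below the [< 50 kbit/s] per-sensor/vehicle figure quoted for structured telemetry, whereas a single unstructured LiDAR stream is quoted at 90 Mbit/s.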
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.2.4 Post-conditions | 1. The mobile metaverse server conducts big data analysis to further refine the accuracy of traffic flow simulation and situation awareness.
2. Both UEs serving (e.g. providing sensing data for) the physical objects and digital twins objects can be identified by the 5G system. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.2.5 Existing features partly or fully covering the use case functionality | The QoS framework of 5G system supports low latency, high reliability or high data rate transmission of data traffic. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.2.6 Potential New Requirements needed to support the use case | [PR 5.2.6-1] The 5G system shall be able to support the following KPIs for the transmission of traffic between a large number of UEs and an application server (e.g. mobile metaverse server).
Use case: 5G-enabled Traffic Flow Simulation and Situational Awareness
Characteristic parameters (KPI):
- Max allowed end-to-end latency: [5-20] ms (NOTE 1)
- Service bit rate (user-experienced data rate): [10~100 Mbit/s] [25] (NOTE 6)
- Reliability: > 99.9%
- Area traffic capacity: [~39.6 Tbit/s/km2] (NOTE 5)
Influence quantities:
- Message data volume: typical data: camera: 10 Mbit/s per sensor (unstructured); LiDAR: 90 Mbit/s per sensor (unstructured); radar: 10 Mbit/s per sensor (unstructured); real-time status information including telemetry data: [< 50 kbit/s] per sensor/vehicle/VRU (structured) (NOTE 2)
- Transfer interval: 20-100 ms (NOTE 3)
- Service area: city- or country-wide (NOTE 4)
NOTE 1: The mobile metaverse server receives the data from various sensors, performs data processing and rendering, and provides feedback to the vehicles and users. The end-to-end latency refers to the transmission delay between a UE and the mobile metaverse server. The exact value is FFS.
NOTE 2: To support at least 80 vehicles and 1600 users present at the same location (e.g. in an area of 40m*250m) to actively enjoy immersive metaverse services for traffic simulation and traffic awareness, the area traffic capacity is calculated considering 2 cameras, 2 radars, 2 LiDARs on the road side, 1600 users’ smartphones, and 80 vehicles with 7 cameras, 4 radars and 2 LiDARs each. These application-layer messages need to be segmented for network transport, so the message data volume does not correspond to the transport packet size. The real-time status information including telemetry data may be structured.
NOTE 3: The frequency considers different sensor types such as Radar/LiDAR (10Hz) and Camera (10~50Hz).
NOTE 4: The service area for traffic flow simulation and situational awareness depends on the actual deployment, for example, it can be deployed for a city or a district within a city or even countrywide. In some cases a local approach (e.g. the application servers are hosted at the network edge) is preferred in order to satisfy the requirements of low latency and high reliability.
NOTE 5: The calculation in this table is done per 5G network; in case N 5G networks are involved in such a use case in the same area, this value can be divided by N. The exact value is FFS.
NOTE 6: The user-experienced data rate refers to the data rate needed for the vehicle or human; the value is observed from industrial practice and the exact value is FFS. |
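As a sanity check, the uplink sensor flows enumerated in NOTE 2 can be summed directly. The sketch below only aggregates the flows listed there; it does not attempt to reproduce the table's [~39.6 Tbit/s/km2] figure, which is marked FFS and may account for additional traffic beyond these enumerated flows.

```python
# Aggregate the sensor flows enumerated in NOTE 2 over the 40 m x 250 m area.
MBIT = 1e6
camera, lidar, radar = 10 * MBIT, 90 * MBIT, 10 * MBIT   # bit/s per sensor
telemetry = 50e3                                          # [< 50 kbit/s] structured

roadside = 2 * camera + 2 * radar + 2 * lidar             # 220 Mbit/s
per_vehicle = 7 * camera + 4 * radar + 2 * lidar          # 290 Mbit/s
vehicles = 80 * per_vehicle                               # 23.2 Gbit/s
status = (80 + 1600) * telemetry                          # 84 Mbit/s

area_km2 = (40 * 250) / 1e6                               # 0.01 km2
total = roadside + vehicles + status                      # ~23.5 Gbit/s
print(total / 1e9, "Gbit/s in the area")
print(total / area_km2 / 1e12, "Tbit/s/km2 from these flows alone")
```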
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3 Use Case on collaborative and concurrent engineering in product design using metaverse services | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3.1 Description | Since the industrial age, engineering design has become an extremely demanding activity. Collaborative and concurrent engineering emerged as a concept and methodology at the end of the last century and was defined as a systematic approach to the integrated co-design of products and their related processes. The diversity and complexity of actual products require collaboration of engineers from different geographic locations to share ideas and solutions with customers and to evaluate product development. VR and AR technologies have found their way into critical applications in industrial sectors such as aerospace engineering, automotive engineering and medical engineering, and also in the fields of education and entertainment. The range of technologies includes Cave Automatic Virtual Environment (better known by the recursive acronym CAVE) environments, reality theatres, power walls, holographic workbenches, individual immersive systems, head mounted displays, tactile sensing interfaces, haptic feedback devices, multi-sensational devices, speech interfaces, and mixed reality systems [6].
Figure 5.3.1-1: XR enabled collaborative and concurrent engineering in product design
(Source: https://vrtech.wiki/)
One of the key challenges is how to enable a distributed virtual environment (DVE) allowing multiple users from different geographical locations (some of whom are present at the same location) to interact over a network. A DVE is defined as a multi-user virtual reality that actively supports communication, collaboration, and coordination [7]: a 3D place-like environment in which participants are provided with graphical embodiments called avatars that convey their identity, presence, location, and activities to others [8]. A DVE is the simultaneous existence of multiple users in the same virtual space represented as avatars, their communication, the shared exploration of 3D visualizations, and the collaborative construction of new content. This avatar representation is essential so that every user knows about the actual perceptions of other users. The users can communicate with each other. They can interact with other users and with the virtual environment. A DVE in the terms of this study is a location agnostic service experience.
To support DVEs for collaborative and concurrent engineering, the 5G system needs to fulfil certain KPIs, such as latency, data rate and reliability. Moreover, the 5G system (with mobile metaverse services) is expected to support the fundamental features including:
- mobile metaverse media support among multiple users;
- User Identity management;
- data security. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3.2 Pre-conditions | Novitas, an innovative start-up company, has set up a distributed virtual environment (with the corresponding 5G communication subscriptions provided by GreenMobile) for collaborative and concurrent engineering in their product design with engineers participating locally and remotely. They have been granted a contract to work together with several partner companies to design and produce a new model of aeroplane engine.
In the current phase, Novitas needs to collaborate closely with Nyhet, a partner company, to design the key parts of the engine. As part of the agreement, they use the distributed virtual environment to carry out some of the design that requires interaction among engineers. Some engineers use mobile phones or computers (as well as the necessary XR devices), with the corresponding 5G communication subscriptions, to attend such engineering meetings. To protect the sensitive business information, strict security requirements for user identity management and data security are crucial.
The service flows below illustrate how engineers interact with each other using services provided by 5G system.
Figure 5.3.2-1: Illustration of Collaborative Workspace (Source: ESI-Icido GmbH) |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3.3 Service Flows | 1. Archimedes, Isambard, Leonardo and George have scheduled an XR design meeting, and Archimedes, Isambard, Leonardo attend from offices while George attends from the factory. Each participant needs to be authenticated before being admitted to the meeting. Due to the strict security requirements, typically participants need to be authenticated using bio information (e.g. finger print, facial image) at the terminal side. The result (not the original bio information) of the terminal side authentication can be forwarded to the corresponding application server of the enterprise. The final result of the network level authentication (can also include the context information, e.g. location information of the participants) is also forwarded to the corresponding application server of the enterprise. Such information helps the enterprise to decide what information (e.g. levels of confidentiality) can be disclosed to which participants during the meeting.
2. Having completed the authentication of the participants, the multimedia communication session/sessions are set up among multiple users as well as the associated devices in the mixed reality systems (e.g. head mounted displays, tactile sensing interfaces, haptic feedback devices, multi-sensational devices). This can be done by means of the IMS (including IMS CN with Data Channel capability) or via OTT applications.
3. When a session starts, multiple streams are established over the 5G network between the corresponding devices that carry multiple modalities data. Table 5.3.3-1 depicts the typical QoS requirements that have to be fulfilled in order for the users’ QoE to be satisfactory.
Table 5.3.3-1 Typical QoS requirements for multi-modal streams [9] [10] [11] [12] [13]
| Parameter | Haptics | Video | Audio |
| Jitter (ms) | ≤ 2 | ≤ 30 | ≤ 30 |
| Delay (ms) | ≤ 50 | ≤ 400 | ≤ 150 |
| Packet loss (%) | ≤ 10 | ≤ 1 | ≤ 1 |
| Update rate (Hz) | ≥ 1000 | ≥ 30 | ≥ 50 |
| Packet size (bytes) | 64-128 | ≤ MTU | 160-320 |
| Throughput (kbit/s) | 512-1024 | 2500-40000 | 64-128 |
4. The haptic information, video and voice are generated at one party and distributed to all other parties continuously. Note that based on the company security policy, some information is shielded before being distributed to certain participants. For example, George joins the meeting from the factory, which is considered less secure according to the company policy. Consequently some sensitive information is filtered before being distributed to George. Information filtering is typically done at the conference centre (i.e. conference focus).
5. George travels back to the office while staying connected to the conference. The connection quality of George’s devices deteriorates sharply, and the 5G network triggers codec re-negotiation to maintain a reasonable quality of experience for all participants. |
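The re-negotiation decision in step 5 can be sketched as a check of per-stream measurements against the targets of Table 5.3.3-1. The function name and decision rule below are illustrative assumptions, not a 3GPP-defined procedure:

```python
# Targets taken from Table 5.3.3-1: max end-to-end delay (ms), max packet loss (%).
TARGETS = {
    "haptics": (50, 10.0),
    "video":   (400, 1.0),
    "audio":   (150, 1.0),
}

def needs_renegotiation(stream, delay_ms, loss_pct):
    """Return True when a stream's measured QoS misses its target."""
    max_delay, max_loss = TARGETS[stream]
    return delay_ms > max_delay or loss_pct > max_loss

# George's radio conditions deteriorate: video delay rises to 600 ms.
print(needs_renegotiation("video", 600, 0.5))   # True -> trigger codec re-negotiation
print(needs_renegotiation("audio", 80, 0.2))    # False -> audio stream is still fine
```

In the use case, the network-condition feedback would come from the 5G network to the IMS CN (see [PR 5.3.6.2-1]); here it is simply passed in as measured values.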
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3.4 Post-conditions | The 5G system enables efficient communication, with enhanced security and identity management, in support of DVEs for the collaborative and concurrent engineering. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3.5 Existing features partly or fully covering the use case functionality | The service requirements on the support of multimedia communication among multiple users have been captured in TS 22.228 [2] with the following key definitions:
Conference: An IP multimedia session with two or more participants. Each conference has a "conference focus". A conference can be uniquely identified by a user. Examples for a conference could be a Telepresence or a multimedia game, in which the conference focus is located in a game server.
Telepresence: A conference with interactive audio-visual communications experience between remote locations, where the users enjoy a strong sense of realism and presence between all participants by optimizing a variety of attributes such as audio and video quality, eye contact, body language, spatial audio, coordinated environments and natural image size.
Telepresence System: A set of functions, devices and network elements which are able to capture, deliver, manage and render multiple high quality interactive audio and video signals in a Telepresence conference. An appropriate number of devices (e.g. cameras, screens, loudspeakers, microphones, codecs) and environmental characteristics are used to establish Telepresence.
Conference Focus: The conference focus is an entity which has abilities to host conferences including their creation, maintenance, and manipulation of the media. A conference focus implements the conference policy (e.g. rules for talk burst control, assign priorities and participant’s rights).
Support of Multi-device and Multi-Identity in IMS MMTEL service is captured in TS 22.173 clause 4.6 [3]:
The support of multiple devices is inherent in IMS. In addition, a service provider may allow a user to use any public user identities for its outgoing and incoming calls. The added identities can but do not have to belong to the served user. Identities may be part of different subscriptions and different operators.
In addition, TS 22.101 [4] has specified in clause 26a a set of service requirements on User Identity:
Identifying distinguished user identities of the user (provided by some external party or by the operator) in the operator network enables an operator to provide an enhanced user experience and optimized performance as well as to offer services to devices that are not part of a 3GPP network. The user to be identified could be an individual human user, using a UE with a certain subscription, or an application running on or connecting via a UE, or a device (“thing”) behind a gateway UE.
Network settings can be adapted and services offered to users according to their needs, independent of the subscription that is used to establish the connection. By acting as an identity provider, the operator can take additional information from the network into account to provide a higher level of security for the authentication of a user.
The 3GPP System shall support to authenticate a User Identity to a service with a User Identifier.
The functional requirement and performance KPIs in support of XR applications are mainly captured in TS 22.261:
- clause 7.6.1 AR/VR;
- clause 6.43 Tactile and multi-modal communication service
- clause 7.11 KPIs for tactile and multi-modal communication service
Clause 8 of TS 22.261 specifies the security related requirements covering aspects such as authentication and authorization, identity management, and data security and privacy.
Additional consideration needs to be given to allow multiple users from different geographical locations to interact using XR techniques.
The 5G system is able to collect charging information per UE or per application for the use of IMS based conferencing services. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3.6 Potential New Requirements needed to support the use case | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.3.6.1 KPIs for the collaborative and concurrent engineering in product design | [PR 5.3.6.1-1] The 5G System shall provide the appropriate connectivity KPIs for the use case of collaborative and concurrent engineering in product design, see table 5.3.6.1-1.
Table 5.3.6.1-1 – Potential key performance requirements for collaborative and concurrent engineering in product design
Use case: Collaborative and concurrent engineering
Characteristic parameters (KPI):
- Max allowed end-to-end latency: [≤10] ms; typical haptic data: [5] ms (note 1)
- Service bit rate (user-experienced data rate): [1-100] Mbit/s ([14])
- Reliability: [> 99.9%] ([14]); typically for haptics: [> 99.9%] without compression, [> 99.999%] with compression (note 4) [26]
- Area traffic capacity: [3.804] Tbit/s/km2 (note 2)
Influence quantities:
- Message size (byte): typical haptic data: 1 DoF: 2-8; 3 DoFs: 6-24; 6 DoFs: 12-48; video: 1500; audio: 100 ([14])
- UE speed: stationary or pedestrian
- Service area: typically < 100 km2 (note 3)
NOTE 1: The network based conference focus is assumed, which receives data from all the participants, performs rendering (image synthesis), and then distributes the results to all participants. The latency refers to the transmission delay between a UE and the application server.
NOTE 2: To support at least 15 users present at the same location (e.g. in an area of 20m*20m) to actively enjoy immersive Metaverse service concurrently, the area traffic capacity is calculated considering per user consuming non-haptic XR media (e.g. for video per stream up to 40000 kbit/s) and concurrently 60 haptic sensors (per haptic sensor generates data up to 1024 kbit/s).
NOTE 3: In practice, the service area depends on the actual deployment. In some cases a local approach (e.g. the application servers are hosted at the network edge) is preferred in order to satisfy the requirements of low latency and high reliability.
NOTE 4: The arrival interval of compressed haptic data usually follow some statistical distributions, such as generalized Pareto distribution, and Exponential distribution [26].
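The area-traffic-capacity figure in note 2 can be reproduced arithmetically: 15 users in a 20 m x 20 m area, each consuming one non-haptic XR media stream of up to 40 000 kbit/s plus 60 haptic sensors at up to 1024 kbit/s each:

```python
# Worked check of the [3.804] Tbit/s/km2 figure from note 2 of Table 5.3.6.1-1.
KBIT = 1e3
per_user = 40_000 * KBIT + 60 * 1024 * KBIT   # 101.44 Mbit/s per user
total = 15 * per_user                          # 1.5216 Gbit/s for 15 users
area_km2 = (20 * 20) / 1e6                     # 0.0004 km2
capacity = total / area_km2
print(capacity / 1e12, "Tbit/s/km2")           # 3.804 Tbit/s/km2
```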
5.3.6.2 Service requirements for collaborative and concurrent engineering in product design
[PR 5.3.6.2-1] The 5G system shall enhance the interaction between IMS CN and 5G CN to allow 5G CN to provide the IMS CN with real-time feedback in support of XR communication among multiple users simultaneously.
NOTE: The feedback can include information such as network condition, achieved QoS. Such information can be used by the IMS CN, for example, to trigger the codec negotiation.
[PR 5.3.6.2-2] Subject to regulatory requirements, operator policies and user consent, the 5G system shall be able to support mechanisms to expose to a trusted third party (e.g. the conference focus) the result of authenticating a user identity to a UE.
NOTE: Authenticating a user identity to a UE at the terminal side is out of 3GPP scope.
[PR 5.3.6.2-3] The 5G system shall provide a means to synchronize multiple data flows from multiple UEs associated with one user. |
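A possible interpretation of [PR 5.3.6.2-3] is sketched below: samples from multiple data flows belonging to one user are aligned on a common presentation timeline before rendering. The function name, the use of application-level timestamps, and the 20 ms tolerance are assumptions for illustration only, not a 3GPP-defined mechanism.

```python
# Align the latest samples of several per-UE flows to one presentation time.
import bisect

def align(flows, t, tolerance_ms=20):
    """Pick, per flow, the sample whose timestamp is closest to presentation
    time t (ms), or None if nothing lies within the tolerance."""
    out = {}
    for name, samples in flows.items():   # samples: sorted list of (ts_ms, payload)
        ts = [s[0] for s in samples]
        i = bisect.bisect_left(ts, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ts)]
        best = min(candidates, key=lambda j: abs(ts[j] - t), default=None)
        within = best is not None and abs(ts[best] - t) <= tolerance_ms
        out[name] = samples[best] if within else None
    return out

flows = {
    "haptic": [(0, "h0"), (1, "h1"), (100, "h100")],   # 1000 Hz bursts with a gap
    "video":  [(0, "v0"), (33, "v33"), (66, "v66")],   # ~30 Hz frames
}
print(align(flows, 34))   # -> {'haptic': None, 'video': (33, 'v33')}
```

A renderer would call this per presentation tick; a flow returning None signals that its UE has fallen behind and its media may need to be concealed or its QoS renegotiated.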
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.4 Use Case on Spatial Anchor Enabler | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.4.1 Description | In use case 5.1 "Localized Mobile Metaverse Service Use Case," we introduced the term spatial anchor to describe an association between space and service information. This use case elaborates the concept of the spatial anchor to enable diverse mobile metaverse services, including those described in use case 5.1. This use case focuses on the creation, management and use of spatial anchors.
The overall purpose of this enabler is to make it possible to create AR content and share it with others. The 'spatial anchor producer' determines what to share and its location, and any constraints (e.g. who to share the spatial anchor with) and additional information, most importantly the 'resource' associated with the anchor (e.g. AR media, mobile metaverse service to access, etc.).
The 'spatial anchor consumer' is able to recognize anchors associated with locations, and use the spatial anchor to obtain the associated information.
Functionally, a spatial anchor has the following model:
- Spatial Anchor: information that can be provided by a content producer to a content consumer. How this is done is out of scope of this use case.
- Precise spatial location information: where the produced content is located, including the content's extent, orientation, etc.
- Service information: this information is out of scope of standardization but could contain, e.g. a URL, media data, media access information, etc. This information is used by an application to access a service.
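The functional model above could be represented as follows. This is a purely illustrative sketch: the field names and types are assumptions, and the service information is kept opaque, since its content is out of scope of standardization.

```python
# Illustrative data model for a spatial anchor: a precise pose plus
# opaque service information and producer-defined sharing constraints.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float; y: float; z: float              # precise position (metres)
    yaw: float; pitch: float; roll: float     # orientation (degrees)

@dataclass
class SpatialAnchor:
    anchor_id: str
    pose: Pose                # where the content is located, incl. orientation
    extent_m: float           # spatial extent of the anchored content
    service_info: bytes       # opaque: e.g. a URL, media data, media access info
    allowed_consumers: set    # constraints set by the spatial anchor producer

anchor = SpatialAnchor(
    "cheese-counter-3",
    Pose(2.0, 1.5, 0.9, 0.0, 0.0, 0.0),
    0.3,
    b"https://warez.example/cheese/3",
    {"customers"},
)
print(anchor.anchor_id)
```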
The spatial anchor enabler will benefit retail environments. Here, a cheese seller has extensive information about her wares that she will share with customers.
Figure 5.4.1-1: Spatial Anchors - created by a user to share with other users
In this use case there are several actors that are relevant.
- Leka the cheese seller is the content producer. She creates content and anchors it to her wares. (She is very adept at putting the cheese back in the same places, and moving the anchors when this is not possible.)
- I am the customer.
- Warez is the mobile metaverse service provider that stores and updates Leka's content, generates its presentation (that is, the AR content that is presented to customers), supports any interactive features, etc.
- FineNet is the network operator that enables anchored services for any content producer, customer and mobile metaverse service provider. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.4.2 Pre-conditions | Leka makes use of a UE that has a subscription with FineNet. She has a sensor that can be used in combination with the UE to indicate precise locations. In figure 5.4.1-1, the sensor can identify the location of the tip of the cheese fork she holds.
Leka obtains services from Warez, where she stores information related to her inventory. She also has arranged in advance what information to display to customers and in what format.
I wear glasses that provide AR experience and communicate by means of my UE. I have a mobile subscription with FineNet also. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.4.3 Service Flows | Leka places wares on display. She indicates the location of wares that are associated with inventory information so that the association of location (listed below as 'pose' including position in 3 dimensional space, orientation and possibly more spatial data) and service information are captured by the 5GS. This inventory information is captured also by the mobile network operator as spatial anchors. This is shown on the top half of Figure 5.4.3-1.
Figure 5.4.3-1: Spatial Anchor Enabler Service Flow
NOTE: The text in parentheses in the figure above are examples.
1. Creating, modifying, removing spatial anchors
To add new wares to display, Leka captures the location of the item with her cheese fork (which includes sensors). She associates a new spatial anchor with this location and product information (e.g. by scanning the bar code on the cheese). The Warez inventory management system also generates AR content on demand (mobile metaverse media): this application is external to 3GPP standards. Leka's UE accesses the mobile metaverse service that establishes the association that comprises the spatial anchor, the physical location and the service information.
When Leka moves wares, Leka can adjust the spatial location of items or remove them entirely (e.g. when the cheese has been sold) by registering the new location or removal with the Warez inventory management system. The 'location' and 'service information' can be updated over time.
Leka can also update the information that will be displayed as AR (mobile metaverse media) to the customer by the mobile metaverse server offered by the Warez inventory management system, for example the description of the cheese, its price, etc. This interaction is out of scope of 3GPP standards.
2. Accessing and using Spatial Anchors
I enter the store and examine what is on display. I capture the scene with sensors and share this information with the Spatial Mapping and Localization Service Enabler (5.5). This allows me to identify my localization information, including orientation, precisely.
I put on (activating) my AR glasses. By means of my UE, with access over FineNet, I share the location and orientation information (the area of interest) with the 5G system.
The 5G system uses the localization information to identify all applicable spatial anchors in the area of interest. These are returned to the UE and the AR glasses. This function is illustrated in the lower half of Figure 5.4.3-1.
The service information suffices to access the media server offered by Warez. The location information indicates the location of each spatial anchor.
I now perceive the spatial anchors as shown in Figure 5.4.1-1.
As Leka indicates the Halloumi in her counter, and my gaze focuses on that location (known to a very high degree of precision), the AR glasses use the service information associated with the spatial anchor to activate the Halloumi cheese media.
I now perceive the AR information panel associated with the Halloumi cheese. |
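The lookup shown in the lower half of Figure 5.4.3-1 can be illustrated with a minimal sketch, assuming a simple radius query over anchor positions (the data layout and function are hypothetical, not 3GPP-defined):

```python
def anchors_in_area(anchors, center, radius):
    """Return the anchors whose position lies within `radius` metres of
    `center` (a 3-tuple) - a stand-in for the 5G system identifying all
    applicable spatial anchors in the consumer's area of interest."""
    cx, cy, cz = center
    hits = []
    for a in anchors:
        dx = a["pos"][0] - cx
        dy = a["pos"][1] - cy
        dz = a["pos"][2] - cz
        if (dx * dx + dy * dy + dz * dz) ** 0.5 <= radius:
            hits.append(a)
    return hits

shop = [
    {"id": "halloumi-01", "pos": (1.0, 0.5, 0.9), "service": "https://warez.example/ar/halloumi"},
    {"id": "gouda-02", "pos": (8.0, 0.5, 0.9), "service": "https://warez.example/ar/gouda"},
]
# The customer stands near the first counter; only nearby anchors are returned.
visible = anchors_in_area(shop, center=(1.2, 0.4, 1.0), radius=2.0)
```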
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.4.4 Post-conditions | I am able to observe the AR content (mobile metaverse media) associated with the wares on display, as shown in Figure 5.4.1-1. As wares are moved or removed from the display, the content shifts as well. The display of AR content is the result of the service information (i.e. how to access the media) and the localization information (i.e. where the media is placed, oriented, etc.)
At any time Leka can add new wares and associated AR content, update the content that is displayed, etc.
I perceive AR content associated with the items in the shop and happily buy the cheese that meets my needs. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.4.5 Existing feature partly or fully covering use case functionality | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.4.6 Potential New Requirements needed to support the use case | [PR 5.4.6-1] Subject to operator policy, the 5G system shall enable an authorized third party to establish an association between a physical location (in three dimensional space, an orientation, etc.) and service information, where the service information is provided to the 5G system and the spatial anchor is either provided or determined by the 5G system.
[PR 5.4.6-2] Subject to operator policy, the 5G system shall be able to establish an association between a physical location (in three dimensional space, an orientation, etc.) and service information, where the service information is provided to the 5G system and the spatial anchor is either provided or determined by the 5G system.
[PR 5.4.6-3] Subject to operator policy, the 5G system shall enable an authorized third party to obtain all of the spatial anchors located in a given three dimensional area.
NOTE 1: How the authorized third party identifies which three dimensional area to request spatial anchors in is not in scope of the 3GPP standard. Spatial localization and mapping information could be used to identify areas of interest.
[PR 5.4.6-4] Subject to operator policy, the 5G system shall enable a third party to request the service information associated with the precise location of a specific spatial anchor. Making use of this service and location information, the third party can access a mobile metaverse server to obtain AR media.
NOTE 2: How the service and location information is used by the third party to access a mobile metaverse server and the AR media itself is out of scope of this requirement.
[PR 5.4.6-5] Subject to operator policy, the 5G system shall provide an authorized third party a means to manage the spatial anchor(s), e.g. add, remove or modify spatial anchors, determine privacy and security aspects, and specifically to enable the third party to define which spatial anchors they manage have restricted access conditions.
[PR 5.4.6-6] The 5G system shall be able to collect charging information for the establishment or management of an association between a physical location and service information, where a third party creates, deletes or changes a spatial anchor or associated service information.
[PR 5.4.6-7] The 5G system shall be able to collect charging information associated with the network operator exposure of spatial anchors to authorized third parties, and of service information associated with spatial anchors.
NOTE 3: The preceding requirements assume that exposure of spatial anchors and associated service information can be a service provided by a network operator to third parties.
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.5 Use Case on Spatial Mapping and Localization Service Enabler | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.5.1 Description | Spatial mapping is constructing or updating a map of an unknown environment; localization is tracking an object to identify its location and orientation over time.
For the localized mobile metaverse use case 5.1, the service provider or operator needs to provide and use spatial map information, i.e. a 3D map of indoor or outdoor environment. This use case considers how a spatial map can be created and employed, both as service enablers. The creation and maintenance of the spatial map is referred to as Spatial Mapping Service and the employment of the map to identify the customer's Localization is termed Spatial Localization Service.
Spatial mapping classifies objects for modelling and tracking into stationary and moving objects. For stationary objects, spatial mapping has to estimate the number of objects, the type of each object and its position, whereas for moving objects it has to determine the position, object type, direction and speed. Once the spatial mapping service has sufficient information, it maps all the stationary and moving objects related to the UE's environment. This information can be provided to the UE, to service providers and to surrounding subscribed users as well [17, 19, 24].
Figure 5.5.1-1: Spatial Location Services Enabler
Specifically, this use case proposes two spatial localization service enablers.
1) Spatial Mapping
Sensing data gathered transparently is processed in order to identify the static and transient forms. For example, in Figure 5.5.1-1:
(A) In the Rubens room in the Louvre, sensing data captures information.
(B) This information is processed to identify the static and transient information, to establish an up-to-date spatial map.
(C) In combination with location information (for the sensors) and other information (e.g. architectural specifications of the Louvre), a spatial map can be achieved in which not only the forms but also their locations in 3D space are known.
2) Localization
Given that a spatial mapping exists,
(A) sensing data for a user (from devices communicating by means of a UE) can be captured,
(B) compared with the spatial map,
(C) used to identify in 3D space the position, viewing direction, angle, etc. of the user.
Spatial Mapping Considerations
Examples when spatial mapping could be useful:
- A government conducting a digital city project can build a 3D spatial map of outdoor environment of an entire city or public spaces such as outdoor parks or indoor offices of their government building.
- A navigation service provider (or Spatial Localization Service) can build a 3D spatial map of outdoor environment of entire roads or public spaces. An operator’s partner or operator also can build a 3D spatial map of outdoor environment.
- A customer wanting mapping of their indoor environment, e.g. the interior of a commercial space such as the cheese shop as described in 5.4.
It is a hard task to perform the mapping of an entire city using a vehicle traversing its various roads and spaces. It also requires a lot of time and effort (for data conversion, etc.) if the work is performed offline. If multiple capturing devices are used in parallel, the spatial map data for the same location can be synthesized over the different cameras and input devices to generate the spatial map.
A mobile capturing device, e.g. a vehicle or robot, is equipped with multiple stereo/mono RGB cameras and multiple LiDARs to capture images of various qualities and depth information of the environment. As an example, in [15] a mobile indoor robot is equipped with two LiDARs, 6 industrial cameras and 4 smartphone cameras. Based on this example, we can derive the uplink bandwidth requirements for one mobile indoor mapping robot. There is a corresponding use case for 'transparent sensing' in TR 22.837.
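A rough, hedged estimate of that uplink demand can be sketched as follows; the per-sensor bit rates are illustrative assumptions, not values taken from [15] or from this TR:

```python
# Rough uplink estimate for one mapping robot (two LiDARs, six industrial
# cameras, four smartphone cameras, per the example in [15]). The per-sensor
# rates below are illustrative assumptions only.
sensors = {
    # name: (count, assumed Mbit/s per sensor after compression)
    "lidar": (2, 40.0),           # e.g. dense point stream, lightly compressed
    "industrial_cam": (6, 25.0),  # e.g. 1080p30, light compression
    "phone_cam": (4, 12.0),       # e.g. 1080p30, stronger compression
}

total_mbps = sum(count * rate for count, rate in sensors.values())
# Several robots mapping target areas in parallel scale the demand linearly.
per_fleet = {n: n * total_mbps for n in (1, 4)}
```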
Localization Considerations
The environment mapping can be used for providing a visual positioning service, for enhancing the accuracy of location service or for helping the metaverse contents management system for spatial internet.
In this use case, the UE provides uplink sensor information that can be interpreted, along with a spatial map, to identify Localization. This is analogous to the process used by the UE to provide sensor data that the 5G system can use to determine UE location for Location Services. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.5.2 Pre-conditions | For Spatial Mapping
Mobile operator or service provider can determine the target area for the spatial mapping. The target area can be divided into multiple areas where each can be mapped with multiple capturing devices (e.g. capturing indoor robot, vehicles).
Each capturing device is equipped with multiple sensors, e.g. mono or stereo RGB cameras or one or more LiDAR sensors. The intrinsic parameters of the cameras on each device are pre-configured in the device.
Each capturing device supports a high-resolution positioning system; the positioning technique can also derive altitude information.
An indoor robot can be equipped with a non-3GPP based high-resolution positioning system such as UWB. To derive the exact position of the capturing device, each capturing device can also utilize structure-from-motion technologies.
The capturing devices communicate via a UE that can access the mobile network of the MNO that supports spatial mapping. The capturing devices can either sense the spatial area to be mapped, or move about sufficiently that the capturing devices can be used to acquire sensing data corresponding to the area.
For Localization
A set of sensors are accessible by a UE. These could be built into the terminal equipment or communicate with it in some way that is out of scope of this use case, e.g. using a cable, a personal area network, etc.
The UE can access a mobile network of MNO M that offers localization services.
In 3GPP Release 16, bistatic localization is performed on both downlink and uplink using time-difference-of-arrival (TDoA), angle-of-arrival (AoA) and angle-of-departure (AoD) measurements at the base station. Both bistatic and monostatic approaches have limited accuracy, although the advent of high-bandwidth mmWave brings high resolution in both the time-delay and angle domains. The spatial mapping problem can be divided into a front-end and a back-end problem. The front-end problem is to determine the data association between landmarks and measurement directions. The back-end problem is to find the probabilistic SLAM density for the data association determined in the front end. Most back-end algorithms, such as the extended Kalman filter (EKF), FastSLAM and GraphSLAM, are based on Bayesian methods [21, 22, 23].
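As a minimal illustration of the Bayesian principle behind those back-end algorithms, the following sketch performs a single scalar Kalman measurement update (a drastic 1D simplification of EKF-SLAM, introduced here for illustration only):

```python
def kalman_update(mean, var, z, r):
    """One scalar Kalman measurement update: fuse a prior landmark estimate
    (mean, var) with a new range measurement z of variance r. This is the
    Bayesian core shared by EKF/FastSLAM/GraphSLAM back-ends, reduced to 1D."""
    k = var / (var + r)            # Kalman gain: how much to trust the measurement
    new_mean = mean + k * (z - mean)
    new_var = (1.0 - k) * var      # posterior variance always shrinks
    return new_mean, new_var

# Prior: landmark at 10.0 m with variance 4.0; measurement: 12.0 m with variance 1.0.
m, v = kalman_update(10.0, 4.0, 12.0, 1.0)
```

Each new observation pulls the estimate toward the measurement in proportion to the relative uncertainties, and reduces the posterior variance.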
5.5.3 Service Flows
For Spatial Mapping
1. A capturing mobile device attaches to the mobile network and becomes authorized to deliver sensing data for the purposes of spatial mapping.
2. The capturing mobile device starts the mapping operation.
a. The capturing mobile device arranges for its sensors to provide information as needed by the 5G system. This could require some configuration of the capturing mobile device, e.g. the sensors or to control the uplink communication of sensing data.
3. The capturing mobile device navigates, e.g. with a pre-defined route, within the selected target region in order to provide a sufficiently complete set of sensor information for the space to be mapped.
4. The capturing mobile device uploads the captured RGB camera images and LIDAR depth images to the 5G system with the positioning information of the mobile device.
5. The mapping server collects and analyzes the information provided to cumulatively create or update a spatial map.
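Step 5 can be sketched as follows, under the simplifying assumption that scans are merged by translating points with the device's reported position (a real mapping server would also handle orientation, calibration and loop closure):

```python
def integrate_scan(global_map, scan_points, device_pos):
    """Illustrative step-5 sketch: translate a scan from the capturing device's
    local frame into the global frame using its reported position, then
    accumulate into the map. Rounding deduplicates repeated observations."""
    px, py, pz = device_pos
    for (x, y, z) in scan_points:
        global_map.add((round(x + px, 2), round(y + py, 2), round(z + pz, 2)))
    return global_map

spatial_map = set()
# Two uploads from the same robot at successive positions along its route.
integrate_scan(spatial_map, [(0.0, 1.0, 0.0), (0.5, 1.0, 0.0)], (10.0, 0.0, 0.0))
integrate_scan(spatial_map, [(-0.5, 1.0, 0.0)], (11.0, 0.0, 0.0))
# Overlapping observations of the same physical point merge in the map.
```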
For Localization
1. A mobile device requiring localization attaches to a mobile network and becomes authorized to obtain Localization services.
2. The mobile device uses sensors to capture information corresponding to the point and direction that has to be localized.
3. The mobile device sends via uplink communication the sensing data to the 5G system.
4. The 5G system uses the sensing data and a spatial map to determine the localization, that is, the corresponding positioning information and sensor pose.
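The localization step can be illustrated with a toy sketch: assuming the front end has already associated device-frame observations with known map landmarks, a position estimate follows by averaging the offsets (a real back-end would solve for a full 6-DoF pose; all names here are hypothetical):

```python
def localize(map_landmarks, observations):
    """Illustrative step-4 sketch: each observation pairs a recognized landmark
    id with its position in the device frame; subtracting from the landmark's
    global position gives a device-position estimate, averaged over landmarks."""
    estimates = []
    for lid, (ox, oy, oz) in observations.items():
        gx, gy, gz = map_landmarks[lid]
        estimates.append((gx - ox, gy - oy, gz - oz))
    n = len(estimates)
    return tuple(sum(c[i] for c in estimates) / n for i in range(3))

# Spatial map: two landmarks with known global positions.
landmarks = {"door": (5.0, 0.0, 0.0), "pillar": (5.0, 4.0, 0.0)}
# Sensing data: where the device sees each landmark in its own frame.
obs = {"door": (2.0, -1.0, 0.0), "pillar": (2.0, 3.0, 0.0)}
pose = localize(landmarks, obs)   # device position estimate
```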
5.5.4 Post-conditions
For Spatial Mapping
The spatial mapping enabler re-structures the data provided by mobile capture devices to create a spatial map. This information could be combined with other information available, e.g. floor plans of buildings, survey data, satellite images, etc. Over time sufficient information will be captured to allow the spatial mapping enabler to distinguish between static and dynamic objects in the environment.
For Localization
The result of the localization service is available as a service enabler. This could be provided to the UE for use by applications or exposed to a third party by means of an API by the 5GS. This information can be used for many purposes, especially for media based applications that require localization to control the rendering of AR or MR content. Localization can be done over time, e.g. to track a user's movement.
The localization enabler can enhance location services as it can identify with some degree of precision a location by means of diverse sensors compared with known location data in the spatial map. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.5.5 Existing feature partially or fully covering use case functionality | Location Services provide location information which can be used to support services that are triggered or informed by the user's location. This information is not at the degree of accuracy needed for some applications, e.g. AR. Location Services also do not work in all environments, e.g. indoor.
The 5GS supports uplink media and uplink sensor data communication.
Clause 4.1.4 of TR 26.928 [18] proposes XR spatial mapping and localization. It specifies that the key areas of XR and AR are spatial mapping, which is creating a map of the surrounding areas, and localization, which is positioning the user in that map. This use case depends on multiple external sensors such as monocular/stereo/depth cameras, radio beacons, GPS, inertial sensors, etc. There is no requirement on how 5G sensing can be used for spatial mapping and localization services.
Positioning in 5G networks was introduced in 3GPP Release 16, which specifies positioning signals and measurements for 5G NR. In Release 16, the 5G positioning architecture extends the 4G positioning architecture by adding the Location Management Function (LMF) and Transmission Reception Points (TRPs). It also enables new positioning methods based on multi-cell round-trip time measurements, multiple antenna beam measurements, downlink angle of departure (DL-AoD) and uplink angle of arrival (UL-AoA) [M, 21, 22]. The current 5G system supports device-based positioning but not device-free positioning, i.e. positioning of objects that do not radiate EM signals. Clause 5.5.6 proposes new requirements to extend this architecture to support the spatial mapping and localization service enablers.
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.5.6 Potential New Requirements needed to support the use case | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.5.6.1 Requirements for Spatial Mapping | [PR 5.5.6.1-1] Subject to operator policy and relevant regional and national regulation, the 5GS shall support mechanisms for an authorized UE to provide sensing data that can be used to produce or modify a spatial map.
[PR 5.5.6.1-2] Subject to operator policy, user consent and relevant regional and national regulation, the 5GS shall support mechanisms to receive and process sensing data to produce or modify a spatial map.
[PR 5.5.6.1-3] Subject to operator policy and relevant regional and national regulation, the 5GS shall support mechanisms to expose a spatial map or derived localization information from that map to authorized third parties.
NOTE 1: The spatial map and derived localization information supports services that produce AR and MR media, e.g. as described in clause 5.1.
NOTE 2: The precision of spatial positioning of sensors that provide sensing data used to create or modify the spatial map is not specified. This could be revisited in future as more experience accumulates with spatial mapping services.
[PR 5.5.6.1-4] The 5G system shall support the collection of charging information associated with the exposure of a spatial map or derived localization information to authorized third parties.
[PR 5.5.6.1-5] The 5G system shall support the collection of charging information associated with the production or modification of a spatial map on behalf of an authorized third party. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.5.6.2 Requirements for Localization | [PR 5.5.6.2-1] Subject to operator policy and relevant regional and national regulation, the 5GS shall support mechanisms to authorize Spatial Localization Service.
[PR 5.5.6.2-2] Subject to operator policy and relevant regional and national regulation, the 5GS shall support mechanisms for an authorized UE to provide sensor data that can be used for the Spatial Localization Service.
[PR 5.5.6.2-3] Subject to operator policy and relevant regional and national regulation, the 5GS shall support mechanisms to expose Spatial Localization Service information to authorized third parties.
NOTE: The Spatial Localization Service information supports services that produce AR and MR media, e.g. as described in clause 5.1.
[PR 5.5.6.2-4] The 5G system shall support the collection of charging information associated with exposing spatial location service information to authorized third parties.
5.6 Use Case on Mobile Metaverse for Immersive Gaming and Live Shows |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.6.1 Description | The mobile metaverse combines the physical and digital world. Mobile metaverse services have already gained significant attention and will benefit multiple areas, such as gaming, social, medical, industry, transport, and so on [27]. Mobile metaverse services will bring more immersive user experience, which will bring more potential requirements to 5G systems. Among these fields, gaming is considered a pioneer in the development of mobile metaverse services. This use case aims to discuss mobile metaverse services for immersive gaming and live shows.
With the support of 5GS, game players can interact with each other on the cloud or edge server, which may form a digital world we term a mobile metaverse. Figure 5.6.1-1 shows the general idea of this use case. The mobile metaverse service may be deployed at the cloud or edge server for immersive gaming and live shows. When the players are playing a basketball game, they may achieve an immersive experience with their avatars, and the avatars can interact with each other, whether the players are in proximity or non-proximity. Meanwhile, other players in the metaverse can join in this digital world as spectators to watch the live show.
The cloud or edge server may perform coding and rendering on the obtained sensor data to generate the digital representation for immersive gaming and live shows, which may be displayed (as if) on a big screen, and the interactive service data can be exchanged among the players and avatars. Here, sensing data includes the players' physical poses and gestures, including movement. For a basketball game, the court and surrounding facilities can also have sensors. The sensor data obtained from these sensors is useful for the metaverse to determine how to perform 3D digital representation of the participants and the setting. An immersive user experience can be provided for the players and their audience. The major impact on 3GPP is whether and how 5GS can be used to better utilize the sensor data and achieve immersive experiences for the multiple players.
Figure 5.6.1-1: Mobile Metaverse for Immersive Gaming and Live Shows
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.6.2 Pre-conditions | The following are pre-conditions for this use case:
1) The computing resource used for game design and real-time processing such as game development library and rendering tools could be provided for mobile metaverse on the cloud or edge server.
2) 5GS is capable of transporting the uplink/downlink service data.
The VR/AR/MR/Cloud Gaming mobile devices, such as mobile headsets or other haptic mobile devices, could be connected to the cloud or edge server for supporting the mobile metaverse immersive game and live show via 5GS. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.6.3 Service Flows | The following are service flows for this use case:
1) Player A (a lady who is a fan of the gaming mobile metaverse service) configures her smartphone (a UE). One path is established between the cloud or edge server and the UE. The UE sends a request to the 5GS, and the 5GS authorizes the request and exposes the capability of mobile metaverse game production and game development (e.g. exposes APIs) to the UE. Player A controls her avatar and performs game production and game development based on the rules made by her and the game development or material library stored on the cloud or edge server. She creates a basketball game venue, game rules, NFT characters (with player attributes), supermarkets, etc., all stored and run on the cloud or edge server.
NOTE: The edge may be a burden when it comes to long-term services, such as mobile metaverse service data storage and large-scale computing. In such a case, a cloud service is essential as a centralized node to maintain a shared space for thousands or even millions of concurrent users in such a large-scale mobile metaverse service, and cloud-edge interaction via 5GS is necessary.
2) Player A invites seven players (B, C, D, E, F, G, H) who are participating in the mobile metaverse service to join the game venue created by herself. These players form two teams to play a 3v3 basketball game match. Among them, B, C, and D are one group, E, F, and G are another group, and H is the game referee. Each member of each group chooses their own digital representation. Then, player A publishes the game match information, and other players as spectators in the metaverse can enter the game venue to watch the match.
NOTE: Players A, B, C, D, E, F, G, and H can be located in different areas in the physical world.
3) The game starts, with the players playing in a venue realized as a mobile metaverse service. High 3D positioning accuracy is required for the digital representations (avatars) that represent the players' locations and gestures in a basketball game. The team members in the physical world control the (digital representation of the) basketball through 5GS in the uplink direction by means of a typical mobile input device, e.g. a VR headset or VR glasses. At the same time, the players can interact with each other, pass the basketball, etc.; though a player has no actual contact with a basketball in the physical world, he can get some haptic experience of the basketball. The team members, both in the physical world and as digital representations, can interact with each other via 5GS anytime, anywhere for an immersive experience.
4) Spectators can watch the game match through the 5G system on a typical mobile device. At the same time, the spectators can view diverse content such as the game attributes, including game rules and player information, by switching the viewing direction. Multiple mobile metaverse media can be provided through the 5G system to give spectators an immersive live show experience. The spectators can interact with each other via the 5G system for an immersive experience.
NOTE: During the running of the game, UE can access mobile metaverse services with low power consumption to reduce the metaverse game interruption. The cloud or edge server is used for coding, rendering, and generating the mobile metaverse media for immersive gaming and live show mobile metaverse services.
5) Player A terminates the game application. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.6.4 Post-conditions | The following are post-conditions for this use case:
The players in the game match achieved an immersive game experience by means of a mobile metaverse service enabled by 5GS.
The spectators in the game match had an immersive live show experience.
The 5GS, together with the cloud or edge side, can address and meet the requirements of the mobile metaverse game service.
Players A, B, C, D, E, F, G, and H may earn money from the game, a mobile metaverse service. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.6.5 Existing features partly or fully covering the use case functionality | In clause 6.43.2 of 3GPP TS 22.261, there are the following requirements:
The 5G system shall enable an authorized 3rd party to provide policy(ies) for flows associated with an application. The policy may contain e.g. the set of UEs and data flows, the expected QoS handling and associated triggering events, and other coordination information.
The 5G system shall support a means to apply 3rd party provided policy(ies) for flows associated with an application. The policy may contain e.g. the set of UEs and data flows, the expected QoS handling and associated triggering events, and other coordination information.
NOTE: The policy can be used by a 3rd party application for the coordination of the transmission of multiple UEs’ flows (e.g., haptic, audio, and video) of a multi-modal communication session. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.6.6 Potential New Requirements needed to support the use case | [PR 5.6.6-1] The 5G System shall support the transmission of uplink sensor data and downlink feedback information with stringent requirements on packet delay and bandwidth for real-time interaction.
[PR 5.6.6-2] The 5G System shall support a mechanism to obtain the location and gestures of the players with stringent requirements on 3D positioning accuracy.
KPI requirements related to the potential requirements:
Table 5.6.6-1 – Potential key performance requirements for immersive gaming and live shows
Use case: Mobile Metaverse for immersive gaming and live shows
Characteristic parameters (KPI):
- End-to-end latency: [5~20] ms
- Service bit rate (user-experienced data rate): [1~1000] Mbit/s
- Reliability: [>99.99%]
- Positioning accuracy: [<1] m
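The ranges in Table 5.6.6-1 can be expressed as a simple conformance check; the function and threshold encoding below are illustrative assumptions, not part of the TR:

```python
# KPI bounds taken from Table 5.6.6-1 (bracketed values are the table's ranges).
KPI = {
    "e2e_latency_ms": (5, 20),
    "bit_rate_mbps": (1, 1000),
    "reliability": 0.9999,    # table requires > 99.99 %
    "positioning_m": 1.0,     # table requires < 1 m
}

def meets_kpi(latency_ms, rate_mbps, reliability, pos_error_m):
    """Check one measured flow against the immersive gaming/live show KPIs."""
    lat_lo, lat_hi = KPI["e2e_latency_ms"]
    rate_lo, rate_hi = KPI["bit_rate_mbps"]
    return (lat_lo <= latency_ms <= lat_hi
            and rate_lo <= rate_mbps <= rate_hi
            and reliability > KPI["reliability"]
            and pos_error_m < KPI["positioning_m"])
```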
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.7 Use Case on AR Enabled Immersive Experience | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.7.1 Description | In addition to watching movies at the cinema, people will also choose to watch a movie on their mobile phones, laptops or TVs when they don't have time to go to the cinema, e.g. when travelling or at home. However, when watching on these terminals, users may feel some discomfort in the neck or cervical spine because they always keep their heads bent down. Moreover, the screen is relatively small; if users want to see more realistic screen details or watch an immersive 3D movie, these terminals are not adequate.
Figure 5.7.1-1: AR Enabled Immersive Experience (image source: www.indiegogo.com)
In this use case, users can get an immersive, location-agnostic service experience of watching movies in certain circumstances, such as at home or on the train. They can even invite some of their friends to watch a movie from different places simultaneously by wearing a wearable device, such as AR glasses. A large screen like that of a movie theatre is presented in the field of view (FOV) of the wearable device, which not only provides an immersive watching experience like a private cinema but also places very low demands on the environment and space of the user's location. 3D cinematic effects can also be easily rendered on the device.
To achieve an immersive, location-agnostic service experience through AR glasses, the 5G system is required to provide reliable transmission of uplink and downlink data and a way for users to synchronize their experience and interact together. In mobility scenarios, e.g. when a user is travelling on a train, the continuity of data transmission also needs to be guaranteed. Moreover, when AR glasses are wireless, the power supply relies on the battery integrated inside the AR glasses. This use case investigates how the 5G system (through direct device connection or NG-RAN) can be used to support a UE in establishing a data connection with the mobile metaverse server. The 5G system can provide services to AR glasses so as to minimize energy consumption in the overall system.
The service dataflows and requirements may differ depending on whether the AR glasses are accessing the service through a direct device connection or NG-RAN. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.7.2 Pre-conditions | Bob wants to watch an AR movie with friends to relax while travelling by train. So he wears wearable equipment such as AR glasses that can access the 5G network (through direct device connection or NG-RAN) to access metaverse services.
Bob has subscribed to an immersive movie service as a mobile metaverse service that he can access via AR glasses. The service gives Bob access to a large movie catalog (2D/3D). The battery capacity of the AR glasses is enough to watch a two-hour movie. Before starting the movie, Bob can invite some friends to join him. If people join Bob, their avatars are also placed into his FOV and Bob can interact with them (speech or text).
Considering users' wearing comfort, mainstream AR glasses should not be too heavy (normally less than 150 grams). A battery weighing about 50 grams has a maximum capacity of about 1000 mAh. Usually, 25% of the battery capacity of AR glasses is allocated to the mobile termination module.
When a user watches a 4K movie, some video compression techniques are usually used to reduce the amount of data transmitted while maintaining the image quality.
In general, a large compression ratio causes increased delay. Considering delay and energy consumption together, using AR glasses with a direct device connection would require a low compression ratio, for instance 3:1 [50]. However, some advanced AR glasses SoCs embed hardware video decoders (e.g., AVC, HEVC and VVC) and can render the viewport efficiently.
A study on the energy consumption of hardware video decoders [59] shows that a typical HEVC hardware decoder embedded in an Android device draws 40-50 mA per hour of video playback (decoding only). The usage of hardware decoders therefore seems reasonable, given the 1000 mAh battery capacity assumed above. For the NG-RAN case, it is reasonable to assume that AR glasses can decode and render efficiently with low energy consumption. |
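The battery figures quoted in this clause (1000 mAh capacity, 25% reserved for the mobile termination module, 40-50 mA per hour of hardware decoding, a two-hour movie) imply a simple energy budget. The following back-of-envelope sketch, using only the assumed values above (none are normative), shows that the budget is plausible:

```python
# Illustrative battery budget for the AR-glasses figures quoted in this
# clause. All constants are assumptions taken from the description above.

BATTERY_MAH = 1000           # total battery capacity (mAh)
MT_SHARE = 0.25              # fraction reserved for the mobile termination module
DECODER_MA_PER_H = (40, 50)  # HEVC hardware decoder draw (mA per hour of playback)
MOVIE_HOURS = 2              # target viewing time

# 250 mAh reserved for 5G communication over the whole session
mt_budget_mah = BATTERY_MAH * MT_SHARE

# 80-100 mAh consumed by hardware decoding over a two-hour movie
decode_mah = tuple(d * MOVIE_HOURS for d in DECODER_MA_PER_H)

remaining_mah = BATTERY_MAH - mt_budget_mah - decode_mah[1]

print(f"MT module budget: {mt_budget_mah:.0f} mAh")
print(f"Decoding cost for a {MOVIE_HOURS} h movie: {decode_mah[0]}-{decode_mah[1]} mAh")
print(f"Remaining for display/compute/other: {remaining_mah:.0f} mAh")
```

Under these assumptions, decoding consumes at most 10% of the battery, leaving most of the capacity for the display and rendering pipeline.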
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.7.3 Service Flows | 1. Data connections are set up between AR glasses and the Metaverse server, which can provide immersive location agnostic service experience of a movie service. The 5G module can be connected to the 5G network either via direct device connection or NG-RAN.
2. The mobile metaverse server provides movie access to the client AR glasses through the downlink data stream.
3. The mobile metaverse server manages communications between clients (friends), e.g., including video, avatar, speech, and text.
4. The mobile metaverse server manages synchronization between the clients (e.g., the various AR glasses) of the friends. |
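Step 4 above (synchronization between clients) can be illustrated with a minimal sketch in which the metaverse server anchors playback to a shared reference time, so every client computes the same movie position from its own clock. The class and function names below are hypothetical, not part of any 3GPP-defined interface:

```python
from dataclasses import dataclass

# Hypothetical sketch of step 4: the server distributes one PlaybackAnchor
# to all clients; each client derives the movie position it should render
# from the anchor and its (synchronized) wall clock.

@dataclass
class PlaybackAnchor:
    start_epoch_s: float   # server-chosen common start time (epoch seconds)
    media_offset_s: float  # movie position at that start time (seconds)

def current_position(anchor: PlaybackAnchor, now_s: float) -> float:
    """Movie position (seconds) a client should render at wall-clock now_s."""
    return anchor.media_offset_s + (now_s - anchor.start_epoch_s)

# Two clients holding the same anchor and synchronized clocks agree on
# the playback position without further signalling:
anchor = PlaybackAnchor(start_epoch_s=1000.0, media_offset_s=0.0)
print(current_position(anchor, 1090.0))  # 90 seconds into the movie
```

In practice the anchor would be re-distributed after pause/seek events, and clock synchronization itself (e.g., via the 5G system's time services) is a prerequisite this sketch takes as given.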
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.7.4 Post-conditions | Bob is able to watch an immersive movie with friends while travelling, obtaining a good user experience. The battery capacity of the AR glasses is enough to watch a two-hour movie.
Bob is able to communicate with his friends in a synchronized manner, and his friends (or their avatars) are visible in his FOV.
The 5G system is able to support communication for immersive location agnostic AR services, providing a reliable transmission, a continuity of service and a synchronized experience across users sharing a viewing experience. |