WWDC25 • Session 302

Create a seamless multiview playback experience

Audio & Video • iOS, macOS, tvOS, visionOS • 19:34

Learn how to build advanced multiview playback experiences in your app. We’ll cover how you can synchronize playback between multiple players, enhance multiview playback with seamless AirPlay integration, and optimize playback quality to deliver engaging multiview playback experiences.

Speaker: Julia Xu

Transcript

Introduction

Hi everyone, I’m Julia, an AVFoundation engineer. In this video, I’ll discuss how to create an engaging user experience in your app across multiple players. People love to get multiple perspectives from live events like sporting competitions or watch multiple channels simultaneously. A multiview playback experience consists of playing multiple streams of audio and video at once.

One use case is playing multiple different streams of the same event. For example, a soccer game with an audio stream for the announcer and two video streams with different perspectives of the field. In this case, it’s important to synchronize playback between the streams. This way, important moments line up. Other examples of synchronized streams might include a music concert which has multiple camera angles, or a keynote speech that has both a main content stream along with a corresponding sign language stream.

Another multiview use case is playing multiple streams of completely different events. For instance, showing streams of different events like track and field and swimming during the Olympics, with some background music. In these cases, the audio and video streams do not have to be synchronized with each other.

AVFoundation and AVRouting have APIs that make it easier to build rich multiview playback experiences. I’ll go over these APIs in this video. I’ll start with how to synchronize playback across multiple streams, then discuss how to handle routing across multiple views for AirPlay. Finally, I’ll share how to optimize playback quality across multiple players.

Multiview playback coordination

When showing multiple streams that need to be coordinated, such as for a sports game, it’s critical to synchronize playback across all of the players so that important moments line up. This means that all playback behaviors like play, pause, and seek need to be coordinated. However, this can be a complicated process. In addition to rate changes and seeks, complex behaviors also need to be managed.

The AVPlaybackCoordinationMedium, from the AVFoundation framework, makes it easier to tightly synchronize playback across multiple players. It handles the coordination of rate changes and time jumps, as well as other complex behaviors like stalling, interruptions, and startup synchronization. I’ll demonstrate how to use the AVPlaybackCoordinationMedium to coordinate between multiple players in your app.

In the demos throughout this video, I’ll use the example of different camera angles of a toy train moving along its tracks through scenes with plants, objects, and landmarks. This train example illustrates what multiview content could look like with multiple camera angles. In the coordination demo, I’ll add multiple camera angles filmed from around the train track, just as if I were watching a sports game and wanted to add different angles of the action. The demo is an iPad app with several different video streams of a toy train moving around the track.

I start with a bird’s-eye view of the train track playing. I want to see more angles of the train, so I add in a side view of the track. The second stream matches up with the currently playing stream. I’ll also add in two more camera angles recorded from around the track.

Each additional stream will join in sync. If I pause, all players will pause in sync. Taking a closer look, I can see the train from multiple angles in all four players. In the top-left bird’s-eye view, I notice the train is positioned near the top edge of the table by the plants. From the other camera angles, I can see that the train is beginning to enter the straight part of the tracks and is approaching the monkey from behind.

Looking at the timestamp on each video, I can confirm that all are at the same time. Now, I’ll play, and all the players will begin playback in sync. I can watch the train in perfect coordination from the various angles. Next, I’ll seek forward by 10 seconds. With each of these actions, the players remain in sync.

Even if I leave my iPad app and switch to a Picture in Picture view, the streams remain synchronized. If I return to the app, all of the videos are still playing in sync. This also works great across system interfaces, like the Now Playing interface. Playback behaviors are also coordinated: I can pause and play, and the players remain in sync.

Coordinating across all players creates a great user experience. In the demo, I showed an example with a train moving through landmarks. In a real-world scenario, this could be sports events, sign language streams, or other multiview use cases where you want to coordinate playback. Now that I’ve gone over what it looks like in action, I’ll show how you can build this experience in your app.

The AVPlaybackCoordinationMedium API builds on the existing playback coordination architecture used for SharePlay. Each AVPlayer has an AVPlaybackCoordinator that negotiates between the playback state of the player and all other connected players. To learn more about the playback coordinator and how it works, check out the video “Coordinate media experiences with Group Activities”.

If there are multiple video players, the playback coordinator needs to handle remote state management and make sure that each player is in sync with the others. The AVPlaybackCoordinationMedium communicates state changes across all playback coordinators. The coordination medium passes state from one coordinator to the other playback coordinators and keeps them all in sync.

This is achieved through messaging. The coordination medium passes messages between players for important state changes like playback rate and time. For instance, if one player pauses, it sends that message to the coordination medium. The coordination medium then forwards it to all other connected playback coordinators, which handle and apply the playback state. This way, all players are able to stay in sync when playing coordinated multiview content.

Implementing this only takes a few lines of code. I start by setting up my AVPlayers, each with a different asset. Here, I’m using two videos: one for a close-up shot and one for a bird’s-eye shot. I configure these separately with different assets. Next, I create the coordination medium. Then, I connect each player to the coordination medium by using the coordinate method. This method can throw errors, so it’s important to handle them.

Finally, I do the same for my second player, the bird’s-eye shot. Now, both playback coordinators are connected to the coordination medium, and the actions on each player will be synchronized. All I have to do is call an action on one player, and all of the other connected players will do the same. In this example, I only used two players, but you can connect more. The AVPlaybackCoordinationMedium is great for coordinating multiview playback, as the sketch below shows.
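
Putting those steps together, here’s a minimal sketch of what this might look like. The stream URLs are placeholders, and the exact parameter label of the throwing coordinate method and the medium’s initializer are assumptions based on the description above:

```swift
import AVFoundation

// Set up two players, each with a different camera angle.
// These URLs are placeholders for your own streams.
let closeUpPlayer = AVPlayer(url: URL(string: "https://example.com/close-up.m3u8")!)
let birdsEyePlayer = AVPlayer(url: URL(string: "https://example.com/birds-eye.m3u8")!)

// Create the coordination medium that will keep the players in sync.
let coordinationMedium = AVPlaybackCoordinationMedium()

do {
    // Connect each player's playback coordinator to the medium.
    // The coordinate method can throw, so handle errors appropriately.
    try closeUpPlayer.playbackCoordinator.coordinate(using: coordinationMedium)
    try birdsEyePlayer.playbackCoordinator.coordinate(using: coordinationMedium)
} catch {
    // Fall back to uncoordinated playback, surface the error, etc.
    print("Failed to coordinate playback: \(error)")
}

// An action on one player now propagates to every connected player.
closeUpPlayer.play()
```

Next, I’ll talk about tools that apply to any type of multiview playback, both coordinated and non-coordinated. AirPlay enables an awesome external playback experience.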

Support AirPlay in an app with multiview experiences

People can route video streams to a larger screen in their home, or route an audio stream to a HomePod. It’s important to route the right view to the right device. I’ll go over how to support AirPlay in your app with multiview experiences. Let’s see that in action! In the example I’ll show, I’m watching a bird's-eye view video along with a track close-up video.

Both videos are playing on my iPad. However, I want to AirPlay from my iPad to the Apple TV. The AirPlay receiver can only support a single stream, so if I route to it, I prefer the bird’s-eye view video to play on the big screen so I can see all the details more clearly.

I begin playback of two videos on my iPad and route to an Apple TV. When I do so, the close-up view continues to play on my iPad, and the bird’s-eye view video plays on the big screen since it’s my preferred player. If I want to change which video is playing on the TV, I can switch between streams by updating the preferred player to be the close-up stream, and the two videos will switch places.

In this iPad app, this is done by pressing the star button, which sets the close-up video as the preferred player. Now, I select it, and the close-up stream plays on the TV while the bird’s-eye view plays on the iPad. Additionally, I can pause and play, and if the streams are coordinated, they’ll remain in sync.

This is an example of a coordinated playback use case. However, uncoordinated multiview streams also work seamlessly with AirPlay. The AVRoutingPlaybackArbiter, which is part of the AVRouting framework, enables you to easily integrate AirPlay support for multiview experiences. The playback arbiter ensures that multiview works smoothly with AirPlay or other external playback experiences that only support a single video or audio stream.

It manages the complexities of switching to the correct video or audio stream. The AVRoutingPlaybackArbiter is responsible for managing and applying preferences on non-mixable audio routes. These are audio routes where only a single audio stream can be played and concurrent audio playback on the receiver is not possible. The playback arbiter also handles constrained external playback video routes. These are routes where only a single video stream can be played on the receiver, such as with AirPlay video and Lightning Digital AV Adapters.

In a multiview playback case, such as with the train multiview videos, I might have a bird’s-eye view video and multiple close-up shots. I want the bird’s-eye view to take priority whenever I AirPlay video. First, I obtain the playback arbiter singleton. Next, I set the bird’s-eye view as the preferredParticipantForExternalPlayback, a property on the playback arbiter. Now, if I route to an Apple TV from my iPad while playing multiview content, the bird’s-eye view routes its video to the Apple TV while the other videos continue to play locally on my iPad.

Similarly, if there are multiple players and the bird’s-eye view should take audio priority, then first obtain the playback arbiter singleton and set the bird’s-eye player as the preferredParticipantForNonMixableAudioRoutes. If multiview content is playing and I route audio to a HomePod from my iPad, the audio of the bird’s-eye view will be played. Next, I’ll show an example of how to use this API.

First, I set up two AVPlayers, one for the close-up shot and one for the bird’s-eye view. Then, I obtain the AVRoutingPlaybackArbiter singleton. I want to see the bird’s-eye video on the big screen whenever I route to AirPlay, so I set it as the preferred participant for external playback. And I want to hear its audio if I route to a HomePod, so I also set it as the preferred participant for non-mixable audio routes.
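
A minimal sketch of that setup might look like the following. The singleton accessor (.shared here) and the stream URLs are assumptions; the two property names come from the walkthrough above, which also suggests the preferred participant is set to the player itself:

```swift
import AVFoundation
import AVRouting

// Set up two players; these URLs are placeholders for your own streams.
let closeUpPlayer = AVPlayer(url: URL(string: "https://example.com/close-up.m3u8")!)
let birdsEyePlayer = AVPlayer(url: URL(string: "https://example.com/birds-eye.m3u8")!)

// Obtain the playback arbiter singleton.
// The exact accessor name (.shared) is an assumption.
let arbiter = AVRoutingPlaybackArbiter.shared

// On a constrained external playback route (e.g. AirPlay video),
// play the bird's-eye video on the external screen.
arbiter.preferredParticipantForExternalPlayback = birdsEyePlayer

// On a non-mixable audio route (e.g. a HomePod),
// play the bird's-eye view's audio.
arbiter.preferredParticipantForNonMixableAudioRoutes = birdsEyePlayer
```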

In this example, I’ve chosen the same player for both properties, but they can be set to different players. Through the AVRoutingPlaybackArbiter, you can ensure a seamless integration of AirPlay and other external audio and video playback experiences in your multiview app. Next, I’ll tell you how to manage the quality of these streams.

Optimize quality of the streams in multiview

When watching multiview content, some streams may be more important than others. For example, when watching a sports game in multiview, one stream might be a bird’s-eye view of the field, two other streams could be different perspectives of the field, and another stream could be close-up views of the crowd.

In this example, I care more about the bird’s-eye view of the field. I want to see it more clearly and have it play at a higher quality. I care less about the close-up views of the crowd, so I don’t need to see them in detail and don’t mind if they play at a lower quality. In a multiview scenario, different players may have different quality needs. Indicate this by setting the AVPlayer’s networkResourcePriority. I’ll discuss in detail how this works. When streaming content, each player consumes network bandwidth.

If these players were equally sized, you might want each to consume an equal amount of network bandwidth and play at the same quality. However, each player may have different network bandwidth and quality needs. To support this, set the networkResourcePriority of the AVPlayer. Each player starts with a default priority level.

You can set the priority level to high or low. A priority level of high means that the player requires a high level of network resources and streaming in a high-quality resolution is crucial. A priority level of low means that the player requires minimal network bandwidth and streaming in high-quality resolution is not as crucial.

I’ll walk through an example of how you might achieve this with the networkResourcePriority. First, create an AVPlayer, and then set the player’s networkResourcePriority. In the sports game example, the field bird’s-eye view is most important, so I set that priority to high. The crowd close-up view is less important, so I set it to low.
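
Here’s a minimal sketch of that, with placeholder stream URLs and assuming the high and low priority levels described above surface as .high and .low in Swift:

```swift
import AVFoundation

// Set up a player per stream; these URLs are placeholders.
let fieldPlayer = AVPlayer(url: URL(string: "https://example.com/field-birds-eye.m3u8")!)
let crowdPlayer = AVPlayer(url: URL(string: "https://example.com/crowd-close-up.m3u8")!)

// The bird's-eye view of the field matters most, so give it high priority.
fieldPlayer.networkResourcePriority = .high

// The crowd close-up matters less, so let it drop in quality first
// when network bandwidth is limited.
crowdPlayer.networkResourcePriority = .low
```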

As a result, the field bird’s-eye view will receive a higher network priority while the crowd close-up view will receive a lower one. These priorities indicate the importance of each player when the system allocates network bandwidth. The exact bandwidth distribution takes a variety of other factors into consideration, such as the number of other players, video layer size, hardware constraints, and more. Next, I’ll show a demo of network priorities in action.

In this demo, I’ll use the train multiview example, but this extends to the sports game example and other multiview scenarios where different playback qualities are required. I’m watching two different streams: a bird’s-eye view of the train track and a close-up view of the train. It’s important for me to watch the train without missing a moment, so I really want to see the bird’s-eye view clearly. I set the network resource priority of the bird’s-eye view to high.

Both videos are currently playing in high resolution; the resolution tags are at the bottom of the videos. If I encounter poor network conditions and network bandwidth is limited, the close-up view on the right will switch down to a lower resolution first. I can see that happen now. The more important bird’s-eye view on the left will maintain a high-definition resolution. By setting the network resource priority of a player, you have greater control over the quality at which a stream plays.

The AVFoundation and AVRouting APIs that I’ve discussed all work together to enable you to build seamless multiview experiences. Now that you’ve seen these advanced multiview features involving playback coordination, AirPlay integration, and quality optimization, build and enhance your own app with multiview. Use the AVPlaybackCoordinationMedium to create compelling synchronized multiview experiences. Synchronize multiple camera angles from your favorite sporting event.

Explore the AVRoutingPlaybackArbiter to enhance a multiview app with AirPlay integration. Take multi-stream playback, such as ASL streams, to the big screen. Fine-tune and optimize playback quality through network bandwidth allocation. Ensure important streams are playing in high quality. We look forward to all the exciting multiview playback experiences you create. Thank you for watching!