Cinemachine Source Code Analysis

Core Concepts and Basic Flow

Core Concepts

CinemachineCore: a global singleton that acts as the manager holding all CinemachineBrain instances and virtual cameras; it also exposes the interfaces that drive virtual-camera update computation and dispatches the related events.

CinemachineBrain: the manager that drives virtual-camera update computation. It is bound to a real camera (typically it sits on the same GameObject as the MainCamera), applies the result of the currently active virtual camera to that real camera, and also computes the blend between two virtual cameras.

CinemachineVirtualCamera: the virtual camera in the everyday sense, essentially a camera "profile" that specifies how the camera should move. Underneath it sits a Component Pipeline split into stages such as Body, Aim and Noise, and each stage can be assigned one component algorithm (a minimal setup sketch follows the list below):

  • Body: controls the camera's position/movement
  • Aim: controls the camera's rotation/orientation
  • Noise: adds procedural camera shake
  • Finalize: the combined result of the stages above is applied to the real camera
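
As one possible way to wire up those stages from script, the sketch below creates a virtual camera and assigns one component per stage (a minimal example against the public Cinemachine 2.x API; the specific component choices and offset values are illustrative, not prescribed by the source):

using Cinemachine;
using UnityEngine;

public class VcamSetupExample : MonoBehaviour
{
    public Transform followTarget;
    public Transform lookAtTarget;

    void Start()
    {
        var vcam = gameObject.AddComponent<CinemachineVirtualCamera>();
        vcam.Follow = followTarget;
        vcam.LookAt = lookAtTarget;

        // Body stage: position the camera relative to the Follow target
        var body = vcam.AddCinemachineComponent<CinemachineTransposer>();
        body.m_FollowOffset = new Vector3(0, 2, -6);

        // Aim stage: rotate the camera to keep the LookAt target framed
        vcam.AddCinemachineComponent<CinemachineComposer>();

        // Noise stage: procedural shake layered on top of the computed pose
        var noise = vcam.AddCinemachineComponent<CinemachineBasicMultiChannelPerlin>();
        noise.m_AmplitudeGain = 0.5f;
    }
}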

CinemachineExtension: inserts extra computation into the callbacks that the upper layers invoke during the virtual-camera update, for example right before the Component Pipeline starts, or after each pipeline Stage finishes.
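
For reference, a custom extension is written by deriving from CinemachineExtension and overriding PostPipelineStageCallback, which the pipeline invokes after each stage; the clamping behaviour below is just an illustrative example:

using Cinemachine;
using UnityEngine;

// Keeps the camera above a minimum height after the Body stage has run
public class LockCameraHeight : CinemachineExtension
{
    public float m_MinHeight = 1f;

    protected override void PostPipelineStageCallback(
        CinemachineVirtualCameraBase vcam,
        CinemachineCore.Stage stage, ref CameraState state, float deltaTime)
    {
        if (stage == CinemachineCore.Stage.Body)
        {
            var pos = state.RawPosition;
            pos.y = Mathf.Max(pos.y, m_MinHeight);
            state.RawPosition = pos;
        }
    }
}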

For a more detailed explanation and usage guide, see: Cinemachine Camera详细讲解和使用

Basic Flow

The overall execution flow of Cinemachine can be roughly summarized as three steps: compute the blend, update the virtual cameras' states, and apply the resulting state to the real camera.

The following uses CinemachineBrain.ManualUpdate as the entry point to walk through the code for these three flows.

Computing the Blend

The blend computation lives mainly in two functions called from CinemachineBrain.ManualUpdate: UpdateFrame0 and ComputeCurrentBlend:

public void ManualUpdate()
{
    m_LastFrameUpdated = Time.frameCount;

    float deltaTime = GetEffectiveDeltaTime(false);
    if (!Application.isPlaying || m_BlendUpdateMethod != BrainUpdateMethod.FixedUpdate)
        UpdateFrame0(deltaTime);

    ComputeCurrentBlend(ref mCurrentLiveCameras, 0);

    ...
}

Before looking at the logic of these two functions, a few prerequisite concepts:

  • BrainFrame and mFrameStack
    • BrainFrame stores the data a CinemachineBrain processes within a single frame; as the code shows, this is mostly blend-related data
    • mFrameStack: holds multiple BrainFrame instances, but essentially only the first element of this list is used for normal processing (the later elements appear to be reserved for Timeline overrides); UpdateFrame0 updates this first element
class CinemachineBrain
{
    ...

    private class BrainFrame
    {
        public int id;
        public CinemachineBlend blend = new CinemachineBlend(null, null, null, 0, 0);
        public bool Active { get { return blend.IsValid; } }

        // Working data - updated every frame
        public CinemachineBlend workingBlend = new CinemachineBlend(null, null, null, 0, 0);
        public BlendSourceVirtualCamera workingBlendSource = new BlendSourceVirtualCamera(null);

        // Used by Timeline Preview for overriding the current value of deltaTime
        public float deltaTimeOverride;

        // Used for blend reversal. Range is 0...1,
        // representing where the blend started when reversed mid-blend
        public float blendStartPosition;
    }

    // Current game state is always frame 0, overrides are subsequent frames
    private List<BrainFrame> mFrameStack = new List<BrainFrame>();

    ...
}
  • CinemachineBlend: stores the data for a blend computation, such as the two cameras being blended, the blend duration, and the current blend progress.
public class CinemachineBlend
{
    /// <summary>First camera in the blend</summary>
    public ICinemachineCamera CamA;

    /// <summary>Second camera in the blend</summary>
    public ICinemachineCamera CamB;

    /// <summary>The curve that describes the way the blend transitions over time
    /// from the first camera to the second. X-axis is normalized time (0...1) over which
    /// the blend takes place and Y axis is blend weight (0..1)</summary>
    public AnimationCurve BlendCurve;

    /// <summary>The current time relative to the start of the blend</summary>
    public float TimeInBlend;

    /// <summary>The current weight of the blend. This is an evaluation of the
    /// BlendCurve at the current time relative to the start of the blend.
    /// 0 means camA, 1 means camB.</summary>
    public float BlendWeight
    {
        get
        {
            if (BlendCurve == null || BlendCurve.length < 2 || IsComplete)
                return 1;
            return Mathf.Clamp01(BlendCurve.Evaluate(TimeInBlend / Duration));
        }
    }

    /// <summary>Validity test for the blend. True if either camera is defined.</summary>
    public bool IsValid => ((CamA != null && CamA.IsValid) || (CamB != null && CamB.IsValid));

    /// <summary>Duration in seconds of the blend.</summary>
    public float Duration;

    /// <summary>True if the time relative to the start of the blend is greater
    /// than or equal to the blend duration</summary>
    public bool IsComplete => TimeInBlend >= Duration || !IsValid;

    ...
}
  • BlendSourceVirtualCamera: wraps a CinemachineBlend object as a virtual camera, so an in-progress blend can act as an intermediate result during camera blending (a short sketch of how the mid-blend state is produced follows the code below).
internal class BlendSourceVirtualCamera : ICinemachineCamera
{
    public BlendSourceVirtualCamera(CinemachineBlend blend) { Blend = blend; }
    public CinemachineBlend Blend { get; set; }

    public string Name { get { return "Mid-blend"; } }
    public string Description { get { return Blend == null ? "(null)" : Blend.Description; } }
    public int Priority { get; set; }
    public Transform LookAt { get; set; }
    public Transform Follow { get; set; }
    public CameraState State { get; private set; }
    public GameObject VirtualCameraGameObject { get { return null; } }
    public bool IsValid { get { return Blend != null && Blend.IsValid; } }
    public ICinemachineCamera ParentCamera { get { return null; } }
    public bool IsLiveChild(ICinemachineCamera vcam, bool dominantChildOnly = false)
        { return Blend != null && (vcam == Blend.CamA || vcam == Blend.CamB); }
    public CameraState CalculateNewState(float deltaTime) { return State; }
    public void UpdateCameraState(Vector3 worldUp, float deltaTime)
    {
        if (Blend != null)
        {
            Blend.UpdateCameraState(worldUp, deltaTime);
            State = Blend.State;
        }
    }
    public void InternalUpdateCameraState(Vector3 worldUp, float deltaTime) {}
    public void OnTransitionFromCamera(ICinemachineCamera fromCam, Vector3 worldUp, float deltaTime) {}
    public void OnTargetObjectWarped(Transform target, Vector3 positionDelta) {}
}

UpdateFrame0

  • First, TopCameraFromPriorityQueue is called to fetch the highest-priority camera, which becomes this frame's active camera activeCamera; the previous frame's active camera outGoingCamera is then read from mFrameStack, where it is stored as mFrameStack[0].blend.CamB

    ICinemachineCamera TopCameraFromPriorityQueue()
    {
        CinemachineCore core = CinemachineCore.Instance;
        Camera outputCamera = OutputCamera;
        int mask = outputCamera == null ? ~0 : outputCamera.cullingMask;
        int numCameras = core.VirtualCameraCount;
        for (int i = 0; i < numCameras; ++i)
        {
            var cam = core.GetVirtualCamera(i);
            GameObject go = cam != null ? cam.gameObject : null;
            if (go != null && (mask & (1 << go.layer)) != 0)
                return cam;
        }
        return null;
    }

    private void UpdateFrame0(float deltaTime)
    {
        // Make sure there is a first stack frame
        if (mFrameStack.Count == 0)
            mFrameStack.Add(new BrainFrame());

        // Update the in-game frame (frame 0)
        BrainFrame frame = mFrameStack[0];

        // Are we transitioning cameras?
        var activeCamera = TopCameraFromPriorityQueue();
        var outGoingCamera = frame.blend.CamB;

        ...
    }

    Note that when picking the highest-priority camera, the virtual camera's Layer is used to decide whether it is eligible: a virtual camera is eligible only if the culling mask (cullingMask) of the real camera bound to the CinemachineBrain includes that virtual camera's Layer.

  • Then check whether this frame's active camera differs from the previous frame's. If they differ, the data in mFrameStack[0] is updated, and at the end this frame's active camera is assigned to mFrameStack[0].blend.CamB

    private void UpdateFrame0(float deltaTime)
    {
        ...

        if (activeCamera != outGoingCamera)
        {
            // Do we need to create a game-play blend?
            if ((UnityEngine.Object)activeCamera != null
                && (UnityEngine.Object)outGoingCamera != null && deltaTime >= 0)
            {
                ...
            }
            // Set the current active camera
            frame.blend.CamB = activeCamera;
        }

        ...
    }

  • Next, the data in mFrameStack[0] is updated; this is mostly blend-related logic

    private void UpdateFrame0(float deltaTime)
    {
        ...

        if (activeCamera != outGoingCamera)
        {
            // Do we need to create a game-play blend?
            if ((UnityEngine.Object)activeCamera != null
                && (UnityEngine.Object)outGoingCamera != null && deltaTime >= 0)
            {
                // Create a blend (curve will be null if a cut)
                var blendDef = LookupBlend(outGoingCamera, activeCamera);
                float blendDuration = blendDef.BlendTime;
                float blendStartPosition = 0;
                if (blendDef.BlendCurve != null && blendDuration > UnityVectorExtensions.Epsilon)
                {
                    if (frame.blend.IsComplete)
                        frame.blend.CamA = outGoingCamera; // new blend
                    else
                    {
                        // Special case: if backing out of a blend-in-progress
                        // with the same blend in reverse, adjust the blend time
                        // to cancel out the progress made in the opposite direction
                        if ((frame.blend.CamA == activeCamera
                                || (frame.blend.CamA as BlendSourceVirtualCamera)?.Blend.CamB == activeCamera)
                            && frame.blend.CamB == outGoingCamera)
                        {
                            // How far have we blended? That is what we must undo
                            var progress = frame.blendStartPosition
                                + (1 - frame.blendStartPosition) * frame.blend.TimeInBlend / frame.blend.Duration;
                            blendDuration *= progress;
                            blendStartPosition = 1 - progress;
                        }
                        // Chain to existing blend
                        frame.blend.CamA = new BlendSourceVirtualCamera(
                            new CinemachineBlend(
                                frame.blend.CamA, frame.blend.CamB,
                                frame.blend.BlendCurve, frame.blend.Duration,
                                frame.blend.TimeInBlend));
                    }
                }
                frame.blend.BlendCurve = blendDef.BlendCurve;
                frame.blend.Duration = blendDuration;
                frame.blend.TimeInBlend = 0;
                frame.blendStartPosition = blendStartPosition;
            }
            // Set the current active camera
            frame.blend.CamB = activeCamera;
        }

        ...
    }
    • If the blend in mFrameStack[0] has already completed, a new blend from the previous frame's active camera to this frame's active camera is recorded into mFrameStack[0]
    • If the blend in mFrameStack[0] has not completed yet, the unfinished blend is wrapped into a BlendSourceVirtualCamera and assigned to mFrameStack[0].blend.CamA as the new blend's starting camera, which chains this frame's blend onto the previous frame's blend
      • One special case must be handled here: if the previous blend is still in progress and the new blend is its exact reverse (the source and target cameras are swapped), the new blend's data is adjusted to keep the transition smooth (a small numeric sketch follows the code at the end of this list)

        For example, suppose the previous blend goes from camera A to camera B with duration C, and it is already 20% done.

        The new blend then goes from camera B back to camera A; if its configured duration is D, D is shortened to D * 20%, and the blend effectively starts from the 80% mark.

        In other words, the previous blend is simply cancelled, and the progress it already made is folded into the new blend.

  • Finally, the blend's progress is advanced

    private void UpdateFrame0(float deltaTime)
    {
        ...

        // Advance the current blend (if any)
        if (frame.blend.CamA != null)
        {
            frame.blend.TimeInBlend += (deltaTime >= 0) ? deltaTime : frame.blend.Duration;
            if (frame.blend.IsComplete)
            {
                // No more blend
                frame.blend.CamA = null;
                frame.blend.BlendCurve = null;
                frame.blend.Duration = 0;
                frame.blend.TimeInBlend = 0;
            }
        }
    }
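
To make the blend-reversal special case concrete, the helper below replays the numeric example from above; it mirrors the math in UpdateFrame0 but is purely illustrative, not part of the library:

static class BlendReversalSketch
{
    // The progress already made on the old blend is what the new,
    // shorter reverse blend must undo.
    public static void Reverse(
        float oldStartPosition, float oldTimeInBlend, float oldDuration,
        float configuredDuration, out float newDuration, out float newStartPosition)
    {
        float progress = oldStartPosition
            + (1 - oldStartPosition) * oldTimeInBlend / oldDuration;
        newDuration = configuredDuration * progress;
        newStartPosition = 1 - progress;
    }
}

// Example: A -> B lasts 2s and has run 0.4s (20% done) when B -> A is requested
// with a configured duration of 3s. The reverse blend then runs for
// 3 * 0.2 = 0.6s, with blendStartPosition = 0.8.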

ComputeCurrentBlend

The logic of ComputeCurrentBlend is fairly simple: for each active frame it copies the properties of frame.blend into frame.workingBlend, then copies the last active frame's workingBlend into outputBlend, which ends up stored in CinemachineBrain.mCurrentLiveCameras.

public void ManualUpdate()
{
    m_LastFrameUpdated = Time.frameCount;

    float deltaTime = GetEffectiveDeltaTime(false);
    if (!Application.isPlaying || m_BlendUpdateMethod != BrainUpdateMethod.FixedUpdate)
        UpdateFrame0(deltaTime);

    ComputeCurrentBlend(ref mCurrentLiveCameras, 0);

    ...
}

public void ComputeCurrentBlend(
    ref CinemachineBlend outputBlend, int numTopLayersToExclude)
{
    // Make sure there is a first stack frame
    if (mFrameStack.Count == 0)
        mFrameStack.Add(new BrainFrame());

    // Resolve the current working frame states in the stack
    int lastActive = 0;
    int topLayer = Mathf.Max(1, mFrameStack.Count - numTopLayersToExclude);
    for (int i = 0; i < topLayer; ++i)
    {
        BrainFrame frame = mFrameStack[i];
        if (i == 0 || frame.Active)
        {
            frame.workingBlend.CamA = frame.blend.CamA;
            frame.workingBlend.CamB = frame.blend.CamB;
            frame.workingBlend.BlendCurve = frame.blend.BlendCurve;
            frame.workingBlend.Duration = frame.blend.Duration;
            frame.workingBlend.TimeInBlend = frame.blend.TimeInBlend;
            if (i > 0 && !frame.blend.IsComplete) // rarely entered: i is 0 in the vast majority of cases
            {
                if (frame.workingBlend.CamA == null)
                {
                    if (mFrameStack[lastActive].blend.IsComplete)
                        frame.workingBlend.CamA = mFrameStack[lastActive].blend.CamB;
                    else
                    {
                        frame.workingBlendSource.Blend = mFrameStack[lastActive].workingBlend;
                        frame.workingBlend.CamA = frame.workingBlendSource;
                    }
                }
                else if (frame.workingBlend.CamB == null)
                {
                    if (mFrameStack[lastActive].blend.IsComplete)
                        frame.workingBlend.CamB = mFrameStack[lastActive].blend.CamB;
                    else
                    {
                        frame.workingBlendSource.Blend = mFrameStack[lastActive].workingBlend;
                        frame.workingBlend.CamB = frame.workingBlendSource;
                    }
                }
            }
            lastActive = i;
        }
    }
    var workingBlend = mFrameStack[lastActive].workingBlend;
    outputBlend.CamA = workingBlend.CamA;
    outputBlend.CamB = workingBlend.CamB;
    outputBlend.BlendCurve = workingBlend.BlendCurve;
    outputBlend.Duration = workingBlend.Duration;
    outputBlend.TimeInBlend = workingBlend.TimeInBlend;
}
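
The blend stored in mCurrentLiveCameras is what CinemachineBrain later exposes through its public query API, so gameplay code can observe the result. A small usage sketch, assuming the Cinemachine 2.x public members ActiveVirtualCamera, IsBlending and ActiveBlend (verify against your version):

using Cinemachine;
using UnityEngine;

public class BrainBlendProbe : MonoBehaviour
{
    [SerializeField] CinemachineBrain brain;

    void LateUpdate()
    {
        // The camera (or mid-blend source) currently driving the output camera
        var live = brain.ActiveVirtualCamera;

        // While a blend is in progress, ActiveBlend describes its endpoints and weight
        if (brain.IsBlending)
        {
            var blend = brain.ActiveBlend;
            Debug.Log($"{blend.CamA.Name} -> {blend.CamB.Name} : {blend.BlendWeight:F2}");
        }
        else if (live != null)
        {
            Debug.Log($"Live camera: {live.Name}");
        }
    }
}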

Updating Virtual Camera State

(Figure: CinemachineBrainUpdateMethod.png)
The virtual camera states are updated by CinemachineBrain.UpdateVirtualCameras. Depending on the CinemachineBrain's Update Method value (see the figure above), it is called either from CinemachineBrain.ManualUpdate (LateUpdate / SmartUpdate) or from CinemachineBrain.AfterPhysics (FixedUpdate / SmartUpdate).

public void ManualUpdate()
{
    ...
    if (m_UpdateMethod == UpdateMethod.FixedUpdate)
    {
        ...
    }
    else
    {
        CinemachineCore.UpdateFilter filter = CinemachineCore.UpdateFilter.Late;
        if (m_UpdateMethod == UpdateMethod.SmartUpdate)
        {
            // Track the targets
            UpdateTracker.OnUpdate(UpdateTracker.UpdateClock.Late);
            filter = CinemachineCore.UpdateFilter.SmartLate;
        }
        UpdateVirtualCameras(filter, deltaTime);
    }
    ...
}

WaitForFixedUpdate mWaitForFixedUpdate = new WaitForFixedUpdate();
private IEnumerator AfterPhysics()
{
    while (true)
    {
        // FixedUpdate can be called multiple times per frame
        yield return mWaitForFixedUpdate;
        if (m_UpdateMethod == UpdateMethod.FixedUpdate
            || m_UpdateMethod == UpdateMethod.SmartUpdate)
        {
            CinemachineCore.UpdateFilter filter = CinemachineCore.UpdateFilter.Fixed;
            if (m_UpdateMethod == UpdateMethod.SmartUpdate)
            {
                // Track the targets
                UpdateTracker.OnUpdate(UpdateTracker.UpdateClock.Fixed);
                filter = CinemachineCore.UpdateFilter.SmartFixed;
            }
            UpdateVirtualCameras(filter, GetEffectiveDeltaTime(true));
        }
        ...
    }
}

Note that whenever the Update Method is SmartUpdate, both paths make an UpdateTracker.OnUpdate call, which tracks how the targets move so that SmartUpdate can decide which clock each camera should be updated on.

UpdateVirtualCameras

private void UpdateVirtualCameras(CinemachineCore.UpdateFilter updateFilter, float deltaTime)
{
    // We always update all active virtual cameras
    CinemachineCore.Instance.m_CurrentUpdateFilter = updateFilter;
    Camera camera = OutputCamera;
    CinemachineCore.Instance.UpdateAllActiveVirtualCameras(
        camera == null ? -1 : camera.cullingMask, DefaultWorldUp, deltaTime);
    ...
    mCurrentLiveCameras.UpdateCameraState(DefaultWorldUp, deltaTime);
    ...
}

From the code above, UpdateVirtualCameras mainly does two things:

  • Calls CinemachineCore.Instance.UpdateAllActiveVirtualCameras // updates the states of standby cameras and live cameras
    • CinemachineCore.Instance.UpdateVirtualCamera // extra bookkeeping to avoid updating the same camera more than once per frame
    • CinemachineVirtualCameraBase.InternalUpdateCameraState // the logic that actually updates the camera state
  • Calls mCurrentLiveCameras.UpdateCameraState // updates the states of the live cameras
    • CinemachineCore.Instance.UpdateVirtualCamera // extra bookkeeping to avoid updating the same camera more than once per frame
    • CinemachineVirtualCameraBase.InternalUpdateCameraState // the logic that actually updates the camera state

Both calls end up updating the live cameras' states, which is exactly why CinemachineCore.Instance.UpdateVirtualCamera contains the extra bookkeeping that prevents the same camera from being updated more than once in a frame (a rough sketch of that guard follows).
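
As a rough illustration of that bookkeeping (a minimal sketch, not the actual CinemachineCore code; the UpdateGuard/UpdateStatus types and field names here are invented for the example), the guard boils down to remembering the frame on which each camera was last updated and skipping repeat requests:

using System.Collections.Generic;
using Cinemachine;
using UnityEngine;

class UpdateGuard
{
    class UpdateStatus { public int lastUpdateFrame = -1; }

    readonly Dictionary<CinemachineVirtualCameraBase, UpdateStatus> mStatus
        = new Dictionary<CinemachineVirtualCameraBase, UpdateStatus>();

    public void UpdateVirtualCamera(
        CinemachineVirtualCameraBase vcam, Vector3 worldUp, float deltaTime)
    {
        if (!mStatus.TryGetValue(vcam, out var status))
            mStatus[vcam] = status = new UpdateStatus();

        // Already updated this frame: skip the redundant request
        if (status.lastUpdateFrame == Time.frameCount)
            return;

        status.lastUpdateFrame = Time.frameCount;
        vcam.InternalUpdateCameraState(worldUp, deltaTime);
    }
}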

UpdateAllActiveVirtualCameras

UpdateAllActiveVirtualCameras is called every frame to update the state of the eligible virtual cameras, which include the live virtual cameras and the standby virtual cameras.
When a standby camera gets updated is determined by the CinemachineVirtualCamera's Standby Update setting:

  • Round Robin: round-robin updating; each frame CinemachineCore picks one standby virtual camera whose Standby Update is Round Robin and updates it
  • Always: the virtual camera keeps updating its state every frame even while in standby
  • Never: the virtual camera does not update its state while in standby

(Figure: CinemachineVritualCameraStandbyUpdate.png)
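
For reference, this inspector setting corresponds to the m_StandbyUpdate field on CinemachineVirtualCameraBase (field and enum names as of Cinemachine 2.x, treat them as assumptions for other versions), so it can also be driven from script:

using Cinemachine;
using UnityEngine;

public class StandbyUpdateExample : MonoBehaviour
{
    [SerializeField] CinemachineVirtualCamera vcam;

    void Start()
    {
        // While in standby, this vcam takes part in the per-frame round-robin updates
        vcam.m_StandbyUpdate = CinemachineVirtualCameraBase.StandbyUpdateMode.RoundRobin;
    }
}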

The rough flow is as follows: