This is required to listen for a generated location event (created by the Generate Location Event module in the Particle Update group). The Receive Location Event has the following settings:
Animation Pose Snapshot
When animating Skeletal Meshes, there may be instances when you want to let physics take control of the mesh (such as a character entering a rag-doll state). After physics has been applied, you can use the Animation Pose Snapshot feature to capture a Skeletal Mesh pose (storing all of the Bone Transform data) within a Blueprint. You can then retrieve that information inside an Animation Blueprint and use the saved pose as a source to blend from (as seen in the example video below).
Above, our character enters a rag-doll state when we press the R key, and we use the Pose Snapshot node in our Character Blueprint to save the pose of our Skeletal Mesh. When we press the R key again, our character blends from that snapshot into a "get up" Animation Montage before resuming the normal locomotion state. This grants us the ability to take whatever pose the character ends up in as a result of physics and generate a smooth blend from that pose into an animation of the character getting back to their feet.
In order to save your Skeletal Mesh's pose at runtime, inside your Character Blueprint, you will need access to the Skeletal Mesh Component and its Anim Instance. After getting the Skeletal Mesh and Anim Instance, you can call the Save Pose Snapshot node and enter the desired Snapshot Name. You can manually enter a name into the Snapshot Name field or create a variable to store the name.
The name you provide as the Snapshot Name must also be used when attempting to retrieve the Pose Snapshot inside your Animation Blueprint. Additionally, when calling Save Pose Snapshot, the snapshot is taken at the current LOD. For example, if you took the snapshot at LOD1 and then used it at LOD0, any bones not in LOD1 will use the mesh's reference pose.
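If you are driving this from C++ instead of a Character Blueprint, the same step can be sketched with the engine's UAnimInstance::SavePoseSnapshot call. The helper below is only an illustration; the function name and the way you obtain the character are placeholders rather than part of this page's Blueprint setup.

```cpp
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"

// Illustrative helper: save the character's current pose under a named snapshot.
// The Snapshot Name used here must match the one entered on the Pose Snapshot
// node in the Animation Blueprint. The snapshot is taken at the mesh's current LOD.
static void SaveCharacterPoseSnapshot(ACharacter* Character, FName SnapshotName)
{
    if (Character == nullptr)
    {
        return;
    }

    USkeletalMeshComponent* MeshComp = Character->GetMesh();
    UAnimInstance* AnimInstance = MeshComp ? MeshComp->GetAnimInstance() : nullptr;

    if (AnimInstance)
    {
        // Stores all Bone Transform data under SnapshotName, mirroring the
        // Save Pose Snapshot Blueprint node.
        AnimInstance->SavePoseSnapshot(SnapshotName);
    }
}
```

You might call a helper like this from the same input handler that toggles the rag-doll state, passing the name you entered on the node.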
To retrieve a Pose Snapshot, inside the AnimGraph of your Animation Blueprint, right-click and add the Pose Snapshot node and enter your Snapshot Name.
Below is the graph used for the example of a character getting up from a rag-doll pose.
Above, we have a State Machine called Default that drives our character's locomotion and uses a Slot node called MySlot to play the Animation Montage of our character getting up. We use the Blend Poses by bool node to determine whether our character has stopped moving: if True, we switch over to our Pose Snapshot; if False, we blend from the Pose Snapshot into our slotted Animation Montage before continuing on with our normal Default State Machine.
An alternative way of using the Pose Snapshot feature is by calling the Snapshot Pose function within Blueprint to save a snapshot to a Pose Snapshot variable.
When using Snapshot Pose, you will need to provide a variable to save the snapshot to as shown above.
Inside your AnimGraph, after adding the Pose Snapshot node, set the Mode to Snapshot Pin and check the (As pin) Snapshot option in the Details panel.
This will expose a Pose Snapshot input pin on the node where you can pass in your desired snapshot variable.
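In C++, the Snapshot Pose path maps to USkeletalMeshComponent::SnapshotPose, which writes into an FPoseSnapshot value. The sketch below is illustrative: the helper name is a placeholder, and routing the result to the node's exposed Snapshot pin (for example through a variable on your Anim Instance) is left to your own setup.

```cpp
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/PoseSnapshot.h"

// Illustrative helper: capture the mesh's current pose into a Pose Snapshot
// value that can later be fed to the Pose Snapshot node's Snapshot pin.
static bool CaptureCharacterPose(ACharacter* Character, FPoseSnapshot& OutSnapshot)
{
    USkeletalMeshComponent* MeshComp = Character ? Character->GetMesh() : nullptr;
    if (MeshComp == nullptr)
    {
        return false;
    }

    // Fills OutSnapshot with the component's current bone transforms.
    MeshComp->SnapshotPose(OutSnapshot);
    return true;
}
```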
Additional Resource
Additional information about this feature can be seen in the following Live Training Stream.
AJA Media Reference
This page describes the options and settings exposed on AJA Media Framework objects.
The AJA Media Source and AJA Media Output have been tested with the following cards, using Version 15.2 or later of the AJA Desktop Software:
Corvid 88
Corvid 44 12G
Corvid 44 12G BNC
Corvid 44
Corvid 44 BNC
KONA 4 (4K bitfile is supported, UFC bitfile is not supported)
KONA 5 (Both KONA 5 (4K) and KONA 5 (8K) bitfiles are supported)
KONA HDMI
Io 4K Plus
Other devices and SDK versions may or may not work as expected.
Each AJA Media Source object that you create exposes the following configuration settings.
Sets whether the incoming video feed is progressive or interlaced. Note that this must match the actual video feed exactly.
Captures video, audio, and ancillary data at the same time. This may decrease your transfer performance, but it guarantees that all data for each frame will be synchronized together. If you experience problems with latency, you can try disabling this option.
Determines the order of the color channels that make up each pixel in the input video, and the number of bits in each channel.
When enabled, the engine embeds the timecode of each frame into the captured video. You can use this to check that the timecode for each frame of input matches the values you're expecting. See Timecode Texel Encoding.
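For context, a configured AJA Media Source asset is opened like any other Media Source at runtime. The C++ sketch below is illustrative only; the asset path is a placeholder, and the settings described above are read from the asset itself rather than set in code.

```cpp
#include "MediaPlayer.h"
#include "MediaSource.h"
#include "UObject/UObjectGlobals.h"

// Illustrative: open a pre-configured AJA Media Source asset with a Media Player.
// The asset path is a placeholder for wherever your AJA Media Source lives.
static void OpenAjaMediaSource(UMediaPlayer* MediaPlayer)
{
    if (MediaPlayer == nullptr)
    {
        return;
    }

    UMediaSource* AjaSource = LoadObject<UMediaSource>(
        nullptr, TEXT("/Game/Media/MyAjaMediaSource.MyAjaMediaSource"));

    if (AjaSource)
    {
        // Playback uses the pixel format, timecode, and capture options
        // configured on the asset in the editor.
        MediaPlayer->OpenSource(AjaSource);
    }
}
```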
Each AJA Media Output object that you create exposes the following configuration settings.
Determines whether the Unreal Engine outputs only the fill image, or both the fill and key images. When you set this to Fill Only, only the fill image is output to the Source set below. When you set this to Fill and Key, the fill image is output to the Source, and the key is output to the Key Source.
Sets the resolution of the video feed produced by this Media Output.
Configures the source of the timing for the internal clock on the AJA card. The card uses this to determine when it should send each frame of video output.
Free Run - Uses the card's internal clock.
External - Synchronizes the card's internal clock with the genlock signal that arrives on its reference pin from an external source.
Input - Synchronizes with the video signal from the input port that you specify in the Sync Source setting below.
When enabled, the Unreal Engine buffers its output frames before sending them to the AJA card. This may improve the smoothness of the video signal, at the cost of some latency. Leave this option disabled to minimize latency, at the risk of seeing interruptions in the output signal.
Determines the order of the color channels that make up each pixel, and the number of bits in each channel.
If you want to output the alpha, set the Output Type setting to Fill and Key, and use the Key Source to send the alpha to an output port on your AJA card.
Sets the number of buffers used to transfer each frame image from main thread memory to the AJA card. Lower values are more likely to cause missed frames while the engine waits for each transfer to complete; higher values are more likely to increase latency.
Sets the number of buffers used to transfer each frame image from the GPU to main thread memory. Lower values are more likely to cause a bottleneck on the GPU side while it waits for each transfer to complete; higher values are more likely to increase latency.
When this option is disabled, and you don't already have the Unreal Engine genlocked to an input signal, the engine runs at the fastest frame rate it can manage and provides all the frames it generates to the AJA card. Each time the card is ready to output a new frame, it selects one of the frames generated by the Engine.
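As with the source, an AJA Media Output asset is typically driven through the engine's generic Media Capture API. The sketch below is a non-authoritative illustration: the asset paths are placeholders, and you would keep the returned UMediaCapture referenced (for example in a UPROPERTY) while the capture runs.

```cpp
#include "MediaOutput.h"
#include "MediaCapture.h"
#include "Engine/TextureRenderTarget2D.h"
#include "UObject/UObjectGlobals.h"

// Illustrative: start capturing a render target through a configured AJA Media Output.
static UMediaCapture* StartAjaCapture()
{
    // Placeholder asset paths.
    UMediaOutput* AjaOutput = LoadObject<UMediaOutput>(
        nullptr, TEXT("/Game/Media/MyAjaMediaOutput.MyAjaMediaOutput"));
    UTextureRenderTarget2D* RenderTarget = LoadObject<UTextureRenderTarget2D>(
        nullptr, TEXT("/Game/Media/MyRenderTarget.MyRenderTarget"));

    if (AjaOutput == nullptr || RenderTarget == nullptr)
    {
        return nullptr;
    }

    // The capture object inherits the output's configuration (output type,
    // pixel format, buffer counts, and so on).
    UMediaCapture* Capture = AjaOutput->CreateMediaCapture();
    if (Capture)
    {
        Capture->CaptureTextureRenderTarget2D(RenderTarget, FMediaCaptureOptions());
    }
    return Capture;
}
```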