Code Example:
UPROPERTY(Category=Links)
FPoseLink Base;

UPROPERTY(Category=Links)
FPoseLink Additive;
Code Example:
UPROPERTY(Category=Settings, meta=(PinShownByDefault))
mutable float Alpha;
Code Example:
UPROPERTY(Category=Settings) FAnimNode_ApplyAdditive Node;
Code Example:
FLinearColor UAnimGraphNode_ApplyAdditive::GetNodeTitleColor() const
{
    return FLinearColor(0.75f, 0.75f, 0.75f);
}

FString UAnimGraphNode_ApplyAdditive::GetNodeTitle(ENodeTitleType::Type TitleType) const
{
    return TEXT("Apply Additive");
}
Code Example:
FString UAnimGraphNode_ApplyAdditive::GetTooltip() const
{
    return TEXT("Apply additive animation to normal pose");
}
Code Example:
void UAnimGraphNode_LayeredBoneBlend::GetContextMenuActions(const FGraphNodeContextMenuBuilder& Context) const
{
    if (!Context.bIsDebugging)
    {
        if (Context.Pin != NULL)
        {
            // we only do this for normal BlendList/BlendList by enum, BlendList by Bool doesn't support add/remove pins
            if (Context.Pin->Direction == EGPD_Input)
            {
                //@TODO: Only offer this option on arrayed pins
                Context.MenuBuilder->BeginSection("AnimNodesLayeredBoneBlend", NSLOCTEXT("A3Nodes", "LayeredBoneBlend", "Layered Bone Blend"));
                {
                    Context.MenuBuilder->AddMenuEntry(FGraphEditorCommands::Get().RemoveBlendListPin);
                }
                Context.MenuBuilder->EndSection();
            }
        }
        else
        {
            Context.MenuBuilder->BeginSection("AnimNodesLayeredBoneBlend", NSLOCTEXT("A3Nodes", "LayeredBoneBlend", "Layered Bone Blend"));
            {
                Context.MenuBuilder->AddMenuEntry(FGraphEditorCommands::Get().AddBlendListPin);
            }
            Context.MenuBuilder->EndSection();
        }
    }
}
Remarks
You can also use helper functions to get the actual nodes in your getters. These exist on the UAnimInstance:
Add a Custom Object Type to Your Project
There will be times when 6 Object Response Channels and 2 Trace Response Channels simply are not granular enough for what you want to create. This is what the Collision Editor in your Project Settings is for. You can access it from the Edit Menu -> Project Settings -> Collision:
From here you can add new Object Response Channels and Trace Response Channels. Click the New Object Channel... or New Trace Channel... button, provide a name, select a Default Response, and click Accept.
You can have up to 18 Custom Object Response Channels or Custom Trace Response Channels.
Custom Presets can also be set here by expanding the Preset Category and clicking the New... button.
Presets
Remarks
From here you can name your preset, enable or disable collision, select your preset's Object Type, and finally define the behavior of each Response Channel for the selected Object Type.
AI Debugging
Once you've created an AI entity you can diagnose problems or simply view what an AI is doing at any given moment using the AI Debugging Tools. Once enabled, you can cycle between viewing Behavior Trees, the Environment Query System (EQS), and the AI Perception systems all within the same centralized location.
To make the most of the AI Debugging tools, you will need a Pawn with an AI Controller in your Level that is running a Behavior Tree or has an AI Perception component. When your AI is executing an EQS, it will be reflected inside the AI Debugging tool when it is running.
To enable AI Debugging, while your game is running, press the ' (apostrophe) key.
The following options are available while the AI Debugging Tools are enabled:
Toggles the AI information that is being displayed:
The Numpad values above and their associated debug information are for the default debugger. Debug categories can be assigned to the 0-9 Numpad keys on a per-project basis to suit your project's needs.
When the AI Debugging tools are enabled, pressing Numpad 0 will toggle the display of the possible locations the AI can currently navigate to from its current position using the Nav Mesh Bounds Volume (if one is placed in the Level).
You can also toggle the display of the Nav Mesh during gameplay with the console command show Navigation true (to display the Nav Mesh) or show Navigation false (to hide the Nav Mesh).
With the AI Debugging tools enabled, pressing Numpad 1 will display the general AI debug information:
The AI category within the AI Debugging tools displays general information about the AI such as:
In addition to the options above, above the Pawn in the Level, you will see the assigned AI Controller Class and Pawn Class (also displayed in the upper-right corner of the Viewport).
When the AI Debugging tools are enabled, pressing Numpad 2 will toggle the display of the Behavior Tree information.
Behavior Tree debug information is split into two categories: the Behavior Tree information (left) and the Blackboard information (right). The Behavior Tree information displays the class of Behavior Tree being used and which branch of the tree is currently being executed (along with the nodes within that branch). The Blackboard debug information will display the Blackboard asset being used along with any Blackboard keys and their current values (which can be useful in determining why the AI is or is not performing an action based on the value of a key).
Inside the Behavior Tree asset, you can also add Breakpoints similar to normal Blueprints to pause the execution of the script when reaching a given node. This can help you diagnose what is occurring at any given time within your behaviors.
You can display information about active Environmental Queries by pressing Numpad 3 when the AI Debugging tools are enabled.
The Visual Logger records EQS data which can also be referred to. Please see the linked page for more details.
The EQS debug information will display the current Environmental Query that is being run along with the Generator used. In the example above, we are using a Simple Grid to determine the best possible location that provides a line of sight to the Player that is nearest the enemy AI character. For this example, we are also presented with the points on our grid (represented by spheres).
The green spheres are locations that pass our Test (has line of sight to the Player) while the blue spheres represent locations that fail our Test (does not have line of sight to the Player). Each sphere is weighted with a numerical value, and our highest weighted value is designated as the "winner" and location that the AI chooses to move to.
You can also press the / (divide) key to display a detailed table breakdown that shows the results of your Tests.
In the example image above, the gray float numbers in the right-most column are the distance in cm and the green ones are the normalized values (from what has been specified in the Test).
In addition to using the AI Debugging tools, EQS provides a way to debug queries while your game is not running by using a special type of Pawn. Please see EQS Testing Pawn for more information.
At runtime, with the AI Debugging tools enabled, pressing Numpad 4 will display Perception System information.
Above, we have an AI character that is set up for Sight (indicated by the green debug lines drawn from the character's head). In the image below, when the AI character sees our Player (which is a source of stimuli for Sight), that location is represented by a green sphere as the Last Known Location.
Any Senses that have been defined on the AI Perception Component under Senses Config will be displayed in the debug window.
Above, we are debugging both Sight (green) and Hearing (yellow) Senses.
Enabling AI Debugging