Game Creator 2 Integration
The integration can be found in Assets/Plugins/CrystalLipSync/Integration Packages/.
CrystalLipSync integrates with Game Creator 2 as IK rigs and Visual Scripting instructions. The GC2 integration files are in the GC2 Integration/ folder and require Game Creator 2 to be installed.

IK Rigs
RigCrystalLipSync
Add to a Character's IK rig list to auto-provision the entire lip sync pipeline:
AudioSource (3D spatial, playOnAwake = false)
CrystalLipSyncController (wired to the AudioSource)
CrystalLipSyncBlendshapeTarget (on the best mesh, auto-mapped)
CrystalTextLipSync (optional, when "Add Text Lip Sync" is enabled)
CrystalMicrophoneLipSync (optional, when "Add Microphone Lip Sync" is enabled)
The rig performs no actual IK bone manipulation; it uses the GC2 IK lifecycle purely for automatic setup. When Auto Provision is enabled, it creates the components automatically at runtime. On model change, it re-scans for the target mesh.
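Auto Provision is equivalent to adding the components by hand. A minimal manual-setup sketch, using the component names listed above; the controller-to-AudioSource wiring field is an assumption for illustration:

```csharp
using UnityEngine;

// Manual equivalent of what RigCrystalLipSync auto-provisions at runtime.
public static class LipSyncManualSetup
{
    public static void Provision(GameObject character, SkinnedMeshRenderer faceMesh)
    {
        // 1. AudioSource: 3D spatial, silent until told to play.
        var source = character.AddComponent<AudioSource>();
        source.spatialBlend = 1f;   // fully 3D
        source.playOnAwake = false;

        // 2. Controller wired to that AudioSource (field name assumed).
        var controller = character.AddComponent<CrystalLipSyncController>();
        // controller.audioSource = source;

        // 3. Blendshape target on the face mesh (auto-mapped when added).
        faceMesh.gameObject.AddComponent<CrystalLipSyncBlendshapeTarget>();

        // Optional extras, mirroring the rig's toggles:
        // character.AddComponent<CrystalTextLipSync>();
        // character.AddComponent<CrystalMicrophoneLipSync>();
    }
}
```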
Enable the Add Text Lip Sync toggle in the rig inspector to also provision a CrystalTextLipSync component for dialogue-driven mouth animation without voice-over audio.
Enable the Add Microphone Lip Sync toggle to provision a CrystalMicrophoneLipSync component for live microphone input (VR, voice chat, etc.).
RigCrystalEyeBlink

Add to a Character's IK rig list for natural eye blinking. Includes all the same settings as the standalone CrystalEyeBlink component. The inspector provides blendshape dropdown selectors to remap detected indices.
Visual Scripting Instructions
All instructions are in the CrystalLipSync category in Game Creator 2's instruction picker.
Play Lip Sync Speech
Plays an AudioClip on the target's AudioSource. If a CrystalLipSyncController exists, its referenced AudioSource is used. Supports Wait To Complete to pause the sequence until playback finishes.
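Conceptually, the instruction behaves like the following coroutine, written in plain Unity API as a sketch (this is not the actual GC2 instruction code):

```csharp
using System.Collections;
using UnityEngine;

public class PlaySpeechExample : MonoBehaviour
{
    // Plays a clip on the controller's referenced AudioSource and, when
    // waitToComplete is true, holds the sequence until playback ends.
    public IEnumerator PlaySpeech(AudioSource source, AudioClip clip, bool waitToComplete)
    {
        source.clip = clip;
        source.Play();

        if (waitToComplete)
            yield return new WaitWhile(() => source.isPlaying);
    }
}
```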
Stop Lip Sync Speech
Stops the currently playing speech and clears the clip.
Set Lip Sync Mood
Changes the active mood on a Controller (Neutral, Happy, Angry, Sad).
Set Lip Sync Audio Source
Swaps the AudioSource that a Controller analyzes at runtime.
Active Eye Blink
Enables or disables the RigCrystalEyeBlink IK rig on a character.
Play Text Lip Sync
Plays text-driven lip sync on a target with a given text string and characters-per-second speed. Supports Wait To Complete.
Stop Text Lip Sync
Stops the currently playing text-driven lip sync on the target. The mouth smoothly returns to rest.
Toggle Microphone Lip Sync
Starts or stops real-time microphone capture on a target's CrystalMicrophoneLipSync component.
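Under the hood, live microphone capture in Unity goes through the Microphone API. A minimal sketch of the idea (not CrystalMicrophoneLipSync's actual implementation): the mic feeds a looping clip that plays through the analyzed AudioSource, so the analyzer can read live spectrum data from it.

```csharp
using UnityEngine;

public class MicCaptureExample : MonoBehaviour
{
    AudioSource source;

    void Start()
    {
        // Record from the default device into a looping 1-second clip.
        source = GetComponent<AudioSource>();
        source.clip = Microphone.Start(null, loop: true, lengthSec: 1, frequency: 44100);
        source.loop = true;

        // Wait until the mic actually delivers samples before playing.
        while (Microphone.GetPosition(null) <= 0) { }
        source.Play();
    }

    void OnDisable() => Microphone.End(null);
}
```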
Why "Play Lip Sync Speech" instead of GC2's built-in audio?

GC2's AudioManager creates pooled AudioSource objects on hidden child GameObjects under the AudioManager transform. Unity's AudioSource.GetSpectrumData() can only read from the exact AudioSource that is playing the audio. Since CrystalLipSync's analyzer reads FFT data from the controller's referenced AudioSource, the audio must play on that same source, not on a pooled one managed by GC2's audio system.
The Play Lip Sync Speech instruction handles this automatically by playing directly on the controller's AudioSource.
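This constraint comes from Unity itself: GetSpectrumData reads the output of the specific AudioSource it is called on. A minimal sketch of the FFT read the analyzer depends on:

```csharp
using UnityEngine;

public class SpectrumReadExample : MonoBehaviour
{
    // Buffer length must be a power of two between 64 and 8192.
    readonly float[] spectrum = new float[256];

    public AudioSource analyzedSource; // must be the source actually playing

    void Update()
    {
        // Yields silence if this source is not the one playing the audio,
        // which is why pooled GC2 sources cannot be analyzed.
        analyzedSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);
    }
}
```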
GC2 Properties
CrystalLipSync adds custom properties to GC2's polymorphic property system. These appear in the property picker dropdowns throughout Game Creator 2.
Play Clip On Source - GetAudioSource Property
Category: Play Clip on Source
Plays an AudioClip on an AudioSource found on a target GameObject. Uses GC2's polymorphic picker so you can select the target as Player, Self, Game Object, etc.

Current Dialogue Text (String Property)
Category: Dialogue / Current Dialogue Text
Returns the text of the dialogue line currently being displayed. Returns an empty string when no dialogue is active. Useful for displaying the current line in custom UI, storing it in a variable, or feeding it to other systems.
Text To Lip Sync
There are two ways to integrate text lip sync with GC2 Dialogue:
Option A: Actor Gibberish Property (Recommended)

The simplest approach, with no extra scene components needed. On your Actor ScriptableObject:
Open the Typewriter Effect foldout.
Set the Gibberish field to CrystalLipSync > Text Lip Sync.
Done. Every time this Actor speaks, their mouth animates from the dialogue text.
The property resolves the speaker automatically from the Dialogue's Roles assignments. It exposes the following settings:
Speaker Target (Optional)
Leave empty in most cases; the speaker is resolved automatically. Only set this if the Actor has no role target, or if you reuse the same Actor across Dialogues with different targets.
Optional Gibberish Audio
Use GC2's polymorphic picker to add gibberish sounds alongside the mouth animation. Select Default Gibberish for GC2's built-in blip, assign a custom clip, or leave as None for silent lip sync.
Option B: CrystalDialogueLipSync Component

For scene-wide automatic handling without modifying Actor assets, add the GC2 Dialogue Text Lip Sync (CrystalDialogueLipSync) component to any persistent GameObject in your scene (e.g. a manager).
Add Component Menu: CrystalLipSync / GC2 Dialogue Text Lip Sync
This component:
Automatically detects when a GC2 Dialogue starts.
Reads each dialogue node's text and the speaking Actor.
Resolves the Actor to a scene GameObject via the Dialogue's role assignments.
Finds the CrystalTextLipSync component on the speaker.
Feeds the text at the Actor's typewriter speed (characters per second).
Stops text lip sync when the node finishes.
Skips nodes with voice-over AudioClips (audio lip sync handles those instead).
Fallback Chars/Sec (default: 10)
Used when the Actor has no typewriter or the typewriter is disabled.
Skip When Audio Present (default: true)
When enabled, text lip sync is skipped for nodes that have a voice-over AudioClip assigned.
Show Debug Logs (default: false)
Logs which speaker and text is being processed.
Note: You don't need both Option A and Option B for the same Actor. Choose one.
Setup requirements for each speaking character
1. A CrystalLipSyncController on the character root.
2. A CrystalLipSyncBlendshapeTarget on the face mesh (auto-mapped).
3. A CrystalTextLipSync on the character root (wired to the controller).
4. The character must be assigned as the Actor's target in the Dialogue component's Roles section.
Tip: The Setup Wizard and the IK rig both handle steps 1-3 automatically when you enable "Add Text Lip Sync".