Parameter Settings

Updated: 02/19/2024

You can adjust camera tracking items and Live2D parameters in nizima LIVE.
nizima LIVE lets you fine-tune details such as the timing and linking of parameter transitions, which previously had to be adjusted in Live2D Cubism.

Upper tab menu

Function | Operation
Motion settings | Fine-tune how the parameters are applied to the Live2D model and the mapping between tracking items and Live2D parameters.
Motion settings beta (ARKit) | Configure settings related to ARKit tracking items used in nizima LIVE TRACKER.
Motion sensitivity | Adjust how much of the movement read by the camera is applied to the tracking item.
Physics | Adjust the motion scale for Live2D parameters for which physics are set.
Automatic playback | Set Live2D parameters that are played back automatically and repetitively, such as breathing.
Other settings | Set the alignment of left and right eye blinks and eyebrow movements and configure lip-sync settings.
[…] | Import and save parameter settings configured in nizima LIVE.

Motion settings

Parameter selection panel (left side)

The following lists the items that can be tracked in nizima LIVE and the Live2D parameters linked to them.

Selecting a tracking item changes the advanced settings panel on the right.

Press ▲ to the right of the tracking item to close or open the associated UI.

nizima LIVE automatically links Live2D parameters to tracking items.
Each default parameter that is linked is displayed under the tracking item.

Each parameter is shown by the name set in the model if the model data includes a cdi.json file; otherwise, it is shown by its ID.
The + button allows you to add a new Live2D parameter to be linked to the tracking item.
The X button allows you to unbind the tracking item from the Live2D parameter.

Tracking items dedicated to iOS are also displayed, regardless of whether the mobile version is connected.

Advanced settings panel (right side)

Parameter ID
Displays the ID of the parameter that the Live2D model has.
Motion scale
You can set the scale at which body movements are applied to the Live2D model’s movements, in the range of 0% to 200%.
Smoothing

This integrates and compensates for minor body shaking and similar jitter, letting you set the smoothness of the Live2D model’s movement in the range of 0% to 100%.

Reverse playback
Reverses the movement of the Live2D model in response to the body’s movement.
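
Taken together, Motion scale, Smoothing, and Reverse playback can be pictured roughly as in the sketch below. The function name, the simple exponential-smoothing formula, and the use of the parameter’s default value are assumptions for illustration, not nizima LIVE’s actual implementation.

    # Illustrative sketch only: how motion scale, smoothing, and reverse playback
    # could combine for a single tracking value (all formulas assumed).
    def apply_motion_settings(tracked, previous_output, default,
                              scale=1.0, smoothing=0.0, reverse=False):
        """scale: 0.0-2.0 (0%-200%), smoothing: 0.0-1.0 (0%-100%)."""
        value = default + (tracked - default) * scale      # motion scale
        if reverse:
            value = default - (value - default)            # reverse playback
        # Higher smoothing makes the output follow new values more slowly.
        return previous_output + (value - previous_output) * (1.0 - smoothing)

    # Example: a tracked value of 20 with 150% scale and 50% smoothing
    print(apply_motion_settings(20.0, previous_output=0.0, default=0.0,
                                scale=1.5, smoothing=0.5))   # -> 15.0
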
Curve editor

You can edit the curve that determines how the Live2D model’s movement changes in response to the body’s movements.

The selected point is displayed in purple. The green point on the curve is the current value.

Click “…” in the upper-right corner of the curve editor to display the various operation buttons.

Operation | Contents
Click | Marks a point.
Right-click | Deletes a point.
Copy the entire curve or Ctrl + C | Copies the shape of the entire curve.
Paste the entire curve or Ctrl + V | Pastes the entire curve shape that you have copied.
Undo or Ctrl + Z | Undoes the operation.
Redo or Ctrl + Y or Ctrl + Shift + Z | Redoes the undone operation.
Show/hide default values | Shows/hides the default values of the parameters that the Live2D model has.
Enable/disable snapping | Toggles whether or not the point will snap to the grid, edge of the screen, etc.
Reset | Initializes the edits and returns the curve shape to its initial value.
Arrow keys | Use the up, down, left, and right arrows to move points.
Tips

Linear, Step, Invert Step, and Bezier at the bottom of the screen let you choose how the selected point connects to the point to its right.

Linear: A straight line connects the selected point and the point to its right. The animation changes at a constant rate, moving linearly.
Step: The selected point and the point to its right are connected by a horizontal line at the value of the selected point. The animation holds that value, then switches to the next value in an instant.
Invert Step: The selected point and the point to its right are connected by a horizontal line at the value of the point to the right. The animation switches to the next value in an instant, then holds it.
Bezier: When selected, a purple [Bezier Handle] appears. By grabbing the round handle, you can move it up, down, left, or right to freely edit the shape of the curve.
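
To make the four connection types concrete, the sketch below evaluates a segment between two points under each rule. It uses a standard cubic Bezier on the value axis and is an assumed illustration, not the editor’s actual code.

    # Illustrative sketch: evaluating a segment between curve points (x0, y0) and
    # (x1, y1) under each connection type. Assumed for illustration only.
    def evaluate_segment(kind, x, p0, p1, handles=None):
        (x0, y0), (x1, y1) = p0, p1
        t = (x - x0) / (x1 - x0)                 # position within the segment, 0..1
        if kind == "linear":
            return y0 + (y1 - y0) * t            # constant rate of change
        if kind == "step":
            return y0 if x < x1 else y1          # hold y0, jump at the next point
        if kind == "invert_step":
            return y1                            # jump immediately to the next value
        if kind == "bezier":
            h0, h1 = handles                     # values of the two Bezier handles
            u = 1.0 - t
            return (u**3 * y0 + 3 * u**2 * t * h0
                    + 3 * u * t**2 * h1 + t**3 * y1)
        raise ValueError(kind)

    # Halfway between (0, 0) and (1, 1):
    print(evaluate_segment("linear", 0.5, (0, 0), (1, 1)))                  # 0.5
    print(evaluate_segment("step", 0.5, (0, 0), (1, 1)))                    # 0
    print(evaluate_segment("invert_step", 0.5, (0, 0), (1, 1)))             # 1
    print(evaluate_segment("bezier", 0.5, (0, 0), (1, 1), handles=(0, 1)))  # 0.5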

Motion settings beta (ARKit)

You can set up tracking of movements captured with an iPhone (ARKit).
The screen layout and operation are the same as for Motion settings.

Auto link function
If the ARKit tracking item name prefixed with “Param” matches a parameter ID of the Live2D model, the tracking item and the parameter are automatically linked.

Example

The tracking item “EyeBlinkLeft” is automatically linked to the Cubism parameter ID “ParamEyeBlinkLeft.” Cubism parameter IDs are case-insensitive when linked.
Either of the following is acceptable.

  • ParamTongueOut
  • ParamtongueOut
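
As a sketch of the auto-link rule described above, the code below builds “Param” + the ARKit tracking item name and compares it against the model’s parameter IDs, ignoring case. The function name and data shapes are hypothetical.

    # Illustrative sketch of the auto-link rule: link a tracking item to a model
    # parameter whose ID equals "Param" + the item name, compared case-insensitively.
    def auto_link(tracking_items, model_parameter_ids):
        ids_by_lower = {pid.lower(): pid for pid in model_parameter_ids}
        links = {}
        for item in tracking_items:
            pid = ids_by_lower.get(("Param" + item).lower())
            if pid is not None:
                links[item] = pid
        return links

    print(auto_link(["EyeBlinkLeft", "TongueOut"],
                    ["ParamEyeBlinkLeft", "ParamtongueOut", "ParamAngleX"]))
    # {'EyeBlinkLeft': 'ParamEyeBlinkLeft', 'TongueOut': 'ParamtongueOut'}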

Motion sensitivity

The value of TRACKING RANGE INPUT corresponds to the value of the tracking parameter.

Tips

It is easier to understand motion sensitivity if you adjust the Min and Max values while actually moving the model.
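
As a rough illustration, the Min and Max values can be thought of as the ends of the range that the raw camera reading is mapped onto: narrowing the range makes the model react more strongly to small movements. The mapping below is an assumption for illustration, not nizima LIVE’s actual formula.

    # Illustrative sketch: mapping a raw camera reading onto a 0..1 tracking value
    # using the Min/Max tracking range (assumed linear mapping with clamping).
    def normalize_tracking_input(raw, range_min, range_max):
        if range_max == range_min:
            return 0.0
        t = (raw - range_min) / (range_max - range_min)
        return min(max(t, 0.0), 1.0)             # out-of-range input saturates

    # The same head movement fills more of a narrow range than a wide one:
    print(normalize_tracking_input(5.0, -30.0, 30.0))   # about 0.58
    print(normalize_tracking_input(5.0, -10.0, 10.0))   # 0.75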

Physics

The parameters set for physics are displayed on the left side.

Parameter ID
Displays the ID of the parameter that the Live2D model has.
Motion scale
You can set the scale at which body movements are applied to the Live2D model’s movements, in the range of 0% to 200%.
The closer the value is to 200%, the greater the Live2D model’s movement in response to the body’s movements.

Automatic playback

Parameter selection panel (left side)

+ button
You can add new Live2D parameters for which you want to set up automatic playback.

Advanced settings panel (right side)

Parameter ID
Displays the ID of the parameter that the Live2D model has.

Repeat type

Loop playback
The movement repeats back and forth between the maximum and minimum values.
Note: This is the recommended way to add breathing to a model.
Repeat playback
When the maximum is exceeded, the movement starts again from the minimum and repeats.

Playback speed
The time taken to move from the minimum to the maximum can be specified in milliseconds.
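
The two repeat types and the playback speed can be pictured roughly as follows. The timing formula is an assumption for illustration only.

    # Illustrative sketch: "loop" bounces between minimum and maximum (e.g. breathing),
    # while "repeat" wraps around to the minimum. Timing follows Playback speed (ms).
    def auto_playback_value(elapsed_ms, speed_ms, minimum, maximum, repeat_type):
        phase = (elapsed_ms / speed_ms) % 2.0           # 0..2 over two passes
        if repeat_type == "loop":
            t = phase if phase <= 1.0 else 2.0 - phase  # back and forth
        else:                                           # "repeat"
            t = phase % 1.0                             # restart from the minimum
        return minimum + (maximum - minimum) * t

    # 3500 ms into a 2000 ms cycle over the range 0..1:
    print(auto_playback_value(3500, 2000, 0.0, 1.0, "loop"))    # 0.25 (coming back down)
    print(auto_playback_value(3500, 2000, 0.0, 1.0, "repeat"))  # 0.75 (restarted from 0)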

[…] in the upper-right corner > [Save] or [Save As]
You can save your settings in .live.json format.

Other settings

Set the alignment of left and right eye blinks and eyebrow movements and configure lip-sync settings.

Eye parts

Align left and right eye blinks and eyebrows
Enables or disables the synchronization function.
Synchronization ratio
Sets the criterion for judging winks and other cases where the left and right sides move out of sync.

Lip-sync

This function allows Live2D models to move in response to audio input from a microphone.
It is mainly used to move the mouth of the Live2D model.

Precautions

Only Cubism motion-sync is available for the Mac version.
Even if you enable this function, motion-sync is not available for models that do not support motion-sync.

Enable lip-sync
Enables/disables the lip-sync function.
When it is disabled, all the following UI items are also disabled.
Enable motion-sync set in Live2D Cubism Editor
The motion-sync settings configured in Live2D Cubism Editor during model creation are used to move the model. This feature is only available on models with a motion-sync setting.
The settings explained below (Apply lip-sync to “Mouth open/close” and “Mouth width,” Time until mouth closes, and Sensitivity adjustment) are not supported when motion-sync is used.
Refer to the Live2D Cubism Editor manual for details on motion-sync settings.
Apply lip-sync to “Mouth open/close” and “Mouth width”
Overwrites the existing mouth opening/closing with lip-sync.
Specifically, the parameters linked to “13. Mouth open/close” and “14. Mouth width” in “Motion settings” are driven by “16. Lip-sync (vertical)” and “17. Lip-sync (horizontal)” instead. When lip-sync is enabled for the first time, this check box is selected automatically.
Input device
Switch the microphone.
Input boost
Amplify the microphone input.
Input threshold
Set noise removal. Sounds at or below the threshold are not applied to lip-syncing (see the sketch at the end of this section).
0 dB: All sounds are eliminated.
Use camera when input is less than or equal to the threshold
When the sound is less than or equal to the threshold, lip-syncing is switched to mouth opening/closing captured with the camera.
This allows for mouth expressions such as surprise and yawning.
Time until mouth closes
Set the time from the end of speaking until the mouth closes.
Sensitivity adjustment, Set automatically
Sets the scale for each vowel.
Follow the guidance to set these scales automatically.
Input value scale
Manually set the scale for each vowel.
Lip-sync output confirmation

Output results can be checked for (A), (I), (U), (E), (O), (Vertical), and (Horizontal).
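
As a rough sketch of how the microphone-side settings above (Input boost, Input threshold, and the camera fallback) could interact, consider the following. The dB handling, the linear mapping to mouth opening, and the function name are assumptions for illustration, not nizima LIVE’s actual processing.

    # Illustrative sketch: input boost amplifies the microphone level, input at or
    # below the threshold is ignored (optionally falling back to the camera), and
    # louder input opens the mouth further. All formulas are assumed.
    def lip_sync_mouth_open(mic_level_db, boost_db, threshold_db,
                            camera_mouth_open, use_camera_below_threshold):
        level = min(mic_level_db + boost_db, 0.0)        # dBFS: 0 dB is full scale
        if level <= threshold_db:                        # at or below the threshold
            return camera_mouth_open if use_camera_below_threshold else 0.0
        # Map the remaining level onto 0..1 mouth opening (assumed linear mapping).
        return min((level - threshold_db) / (0.0 - threshold_db), 1.0)

    # Quiet room noise falls back to the camera; speech drives the mouth instead:
    print(lip_sync_mouth_open(-50.0, 6.0, -40.0, camera_mouth_open=0.3,
                              use_camera_below_threshold=True))   # 0.3
    print(lip_sync_mouth_open(-20.0, 6.0, -40.0, camera_mouth_open=0.3,
                              use_camera_below_threshold=True))   # 0.65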
