tks_yoshinaga's diary

Notes on things I've tried with technologies such as Kinect, Leap Motion, VR, and AR

Manipulating Nearby Objects in MetaQuest

0. Contents of This Article

This article introduces how to grab and manipulate nearby 3D objects (cubes) on Meta Quest. The demo video shows the AR version, but the procedure works in both AR and VR. You can also try it out in the 02-NearManipulation or 02-NearManipulation-AR scene of the sample available on GitHub.

 

Please note that this article assumes that the preparations described in the following have been completed.

[Preparation]

 

 

1. Setting to Grab Nearby Objects with Controllers or Hands

Here, we introduce the components to add and how to configure them so that objects can be grabbed with controllers or hands. If you set up both, controllers and hands can be used interchangeably.

[Grab with Controllers]

  • Select the Cube in the Hierarchy and click Add Component in the Inspector
  • Search for Grab Interactable and select Grab Interactable from the candidates
  • In the added Grab Interactable, drag and drop the Cube onto Pointable Element
    * This connects Grab Interactable to Grabbable, so the behavior configured on Grabbable is driven by the controller's grab input
  • Also drag and drop the Cube onto Rigidbody in Grab Interactable
    * Assigning the Rigidbody that was added to the Cube during preparation enables contact detection between the controller and the Cube
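The Inspector steps above can also be wired up from a script. The following is only a minimal sketch: it assumes the Meta XR Interaction SDK's `Oculus.Interaction` namespace and its `Inject*` setter methods, so verify the exact type and method names against your SDK version.

```csharp
using UnityEngine;
using Oculus.Interaction; // Meta XR Interaction SDK (names assumed; check your SDK version)

// Attach this to the Cube that already has a Rigidbody and a Grabbable
// (both added during the preparation steps).
public class CubeControllerGrabSetup : MonoBehaviour
{
    void Awake()
    {
        // Equivalent of "Add Component -> Grab Interactable"
        var grabInteractable = gameObject.AddComponent<GrabInteractable>();

        // Equivalent of dragging the Cube onto Pointable Element:
        // the Grabbable receives the pointer events raised by the controller grab.
        grabInteractable.InjectOptionalPointableElement(GetComponent<Grabbable>());

        // Equivalent of dragging the Cube onto Rigidbody:
        // the Rigidbody is used for contact detection with the controller.
        grabInteractable.InjectRigidbody(GetComponent<Rigidbody>());
    }
}
```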

 

[Grab with Hands]

  • Select the Cube in the Hierarchy and click Add Component in the Inspector
  • Search for Grab Interactable and select Hand Grab Interactable from the candidates
  • In the added Hand Grab Interactable, drag and drop the Cube onto Pointable Element
    * This connects Hand Grab Interactable to Grabbable, so the behavior configured on Grabbable is driven by the hand's grab input
  • Also drag and drop the Cube onto Rigidbody in Hand Grab Interactable
    * Assigning the Rigidbody that was added to the Cube during preparation enables contact detection between the hands and the Cube
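The hand-grab wiring can be sketched in code the same way. This is a sketch under the assumption that `HandGrabInteractable` (in `Oculus.Interaction.HandGrab`) offers the same `Inject*` setters; additional injection (for example, supported grab types) may be required depending on the SDK version.

```csharp
using UnityEngine;
using Oculus.Interaction;          // names assumed; check your SDK version
using Oculus.Interaction.HandGrab;

// Attach this to the Cube that already has a Rigidbody and a Grabbable.
public class CubeHandGrabSetup : MonoBehaviour
{
    void Awake()
    {
        // Equivalent of "Add Component -> Hand Grab Interactable"
        var handGrab = gameObject.AddComponent<HandGrabInteractable>();

        // Equivalent of the two drag-and-drop assignments in the Inspector.
        handGrab.InjectOptionalPointableElement(GetComponent<Grabbable>());
        handGrab.InjectRigidbody(GetComponent<Rigidbody>());
    }
}
```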

 

[Attention!]

If Controller Only is selected for Hand Tracking Support in the OVR Manager attached to the OVRCameraRig object, hand tracking will not work and you will not be able to grab objects with your hands. Be sure to check this setting.

 

2. Fine-tuning Controller Settings (Optional)

By default, you can grab and move objects with controllers, and grabbing is assigned to the grip button (the middle-finger button). Here's how to grab with a different button instead, for example the trigger.

  • Select the child element OVRCameraRig of OVRCameraRigInteraction in the Hierarchy
  • Open OVRInteractionComprehensive -> OVRControllers under it
  • Make sure there are LeftController and RightController as child elements of OVRControllers
  • If you want to configure the left controller, open LeftController and follow these steps:
    ControllerInteractors -> ControllerGrabInteractor -> GripButtonSelector
  • With GripButtonSelector selected, look at the ControllerSelector component in the Inspector
  • Turn on the TriggerButton checkbox in Controller Button Usage
    * If GripButton is not required, uncheck GripButton
  • Perform the same steps on RightController to set the grab button for the right controller
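If you prefer to switch the button from a script, the change can be sketched as follows. Note the assumptions: I'm assuming here that `ControllerSelector` exposes a `ControllerButtonUsage` flags property matching the Controller Button Usage checkboxes, so check the actual field names in your SDK version.

```csharp
using UnityEngine;
using Oculus.Interaction; // names assumed; check your SDK version

// Attach to GripButtonSelector (under ControllerGrabInteractor) to grab
// with the trigger instead of the grip button.
public class UseTriggerForGrab : MonoBehaviour
{
    void Start()
    {
        var selector = GetComponent<ControllerSelector>();

        // Same as checking TriggerButton (and leaving GripButton unchecked)
        // under Controller Button Usage in the Inspector.
        selector.ControllerButtonUsage = ControllerButtonUsage.TriggerButton;
    }
}
```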

 

3. Fine-tuning Hand Behavior (Optional)

It's a minor point, but by default, when you scale objects with both hands, the hand object does not follow the actual hand. If you want the hand object to follow, try the following:

  • Select Cube in the Hierarchy
  • In the Inspector, look at the Hand Grab Interactable component added to the Cube
  • Change the setting of Hand Alignment to None
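The same change can be made from a script. This sketch assumes `HandGrabInteractable` exposes a writable `HandAlignment` property of an enum type (here assumed to be `HandAlignType`); verify both names against your SDK version.

```csharp
using UnityEngine;
using Oculus.Interaction.HandGrab; // names assumed; check your SDK version

// Attach to the Cube to make the hand model keep following the real hand
// while the object is held (equivalent to setting Hand Alignment to None).
public class DisableHandAlignment : MonoBehaviour
{
    void Start()
    {
        var handGrab = GetComponent<HandGrabInteractable>();
        handGrab.HandAlignment = HandAlignType.None;
    }
}
```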

 

4. Using Remote Object Manipulation in Conjunction with Near Manipulation

You can use remote object manipulation together with nearby object manipulation on the same object. This article covered manipulating nearby objects; if you also want to manipulate distant objects, follow the steps from Chapter 1 of the following article.



5. List of Articles on Meta XR SDK