[Integration Notice] Avatar Grab Unity sample available

mfmf
Oculus Staff
Hey folks: we've put together a sample of using the Avatar SDK with a few scripts to add object grab/throw functionality. I've attached it here. It requires the Oculus Utilities for Unity, and the Avatar SDK. 

This isn't the end of our hand samples, but I didn't want to leave you without at least a basic grab sample for too long. 

Known issue: it works in 5.5.0f3, but crashes in all 5.4.x versions of Unity I've tested so far. For now it's a 5.5-only sample. It doesn't rely on any 5.5-only functionality, however, so you should be able to use the scripts in 5.4 without a problem.

EDIT: updated to behave well with locomotion such as OVRPlayerController, 1/3/2017.

ColdSpike
Explorer

dvir said:

@clint205: the best place for that would probably be in the Grabbable script, in GrabBegin and GrabEnd. You'll want to get a reference to the LocalAvatar's OvrAvatar component at that point, and then set avatar.RightHandCustomPose to the transform you made through this example: https://developer3.oculus.com/documentation/avatarsdk/latest/concepts/avatars-sdk-unity/#avatars-sdk...

For a really, really quick hack, drag the GameObject prefab you made from that tutorial to the Resources folder, call it for example "customHandPose", and then put this at the end of your Grabbable.GrabBegin call:

  GameObject.FindObjectOfType<OvrAvatar>().RightHandCustomPose = ((GameObject)Resources.Load("customHandPose")).transform;

Let me know if that is still not clear 🙂
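Fleshed out a little, that hack might look like the following sketch (it assumes the "customHandPose" prefab from the tutorial sits in a Resources folder and that the sample's Grabbable script exposes GrabBegin/GrabEnd; the class and method names here are only illustrative, not part of the sample package):

  using UnityEngine;

  // Illustrative helper, not part of the sample package: caches the LocalAvatar and the
  // custom pose once, then swaps the right-hand pose on grab and restores it on release.
  public class CustomHandPoseSwitcher : MonoBehaviour {
      private OvrAvatar avatar;
      private Transform customPose;

      void Start () {
          avatar = FindObjectOfType<OvrAvatar>();
          customPose = ((GameObject)Resources.Load("customHandPose")).transform;
      }

      // Call this at the end of Grabbable.GrabBegin
      public void ApplyPose () {
          avatar.RightHandCustomPose = customPose;
      }

      // Call this at the end of Grabbable.GrabEnd (clearing the pose should return the hand to normal tracking)
      public void ClearPose () {
          avatar.RightHandCustomPose = null;
      }
  }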


So is there an easier way to make custom poses? First off, you obviously can't do that unless the scene is running, and I couldn't get it to work properly because with VR enabled I could barely see what I was rotating and what I wasn't. On top of that, the shader for the hands wouldn't render unless I moved them about, which again makes it harder to tell what I'm animating.

ColdSpike
Explorer
To anyone who approves comments, I didn't know they needed to be approved and the message which tells you they do only flashes for a split second so... sorry for posting so many times XD

mfmf
Oculus Staff
NOTE: updated attached package to work well with OVRPlayerController and other things that move the avatar around.

mfmf
Oculus Staff

pjenness said:

Is it just me, or has the original OVRTouch functionality gone, where a fist or pointed finger generated collision detection?
I used to be able to punch rigid bodies, but I can't with the new Avatar-based hands.



The custom hand sample will have that functionality when we release it later.

mfmf
Oculus Staff


Why is it so jittery when you hold an object? This should work even if Interpolate is set to none. I have a touch project with grab that I made from Ben Roberts' YouTube tutorial, and that works without the jitter. There has to be a solution other than changing the physics time step.


It's jittery because you're updating the object's position at a different frame rate than that of the HMD or of the hand. With that exact approach, it's going to be jittery.

There are many other approaches. Parenting is a simple one. If you need solid physics, you can separate the visual mesh and rigid body, and parent the visual mesh while updating the rigid body at a lower frame rate.
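As a rough sketch of that second approach (the names here are hypothetical, not from the sample package; it assumes the grabbed object has a child transform holding its renderers and a Rigidbody on the root):

using UnityEngine;

// Sketch: parent only the visual mesh to the hand so rendering stays smooth at HMD
// frame rate, while the Rigidbody follows at physics rate for correct collisions.
public class SmoothGrabbable : MonoBehaviour {
    public Transform visualMesh;   // child object holding the renderers
    private Rigidbody rb;
    private Transform handAnchor;  // set while grabbed

    void Awake () {
        rb = GetComponent<Rigidbody>();
    }

    public void OnGrab (Transform hand) {
        handAnchor = hand;
        visualMesh.SetParent(hand, true);      // renders at HMD/hand frame rate, no visible jitter
    }

    public void OnRelease () {
        visualMesh.SetParent(transform, true); // reattach the mesh to the physics object
        handAnchor = null;
    }

    void FixedUpdate () {
        if (handAnchor != null) {
            // The Rigidbody only catches up at the physics time step, but since it is
            // no longer what you see, the lower update rate doesn't show as jitter.
            rb.MovePosition(handAnchor.position);
            rb.MoveRotation(handAnchor.rotation);
        }
    }
}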

StormyDoesVR
Heroic Explorer
@pjenness

Thanks for the code sample! Just having a bit of trouble, it doesn't recognize the word User and continues to error out. Any idea why that could be happening?
------------------------------
PC specs - GTX 1070 (MSI AERO); 16GB Corsair Vengeance DDR3; intel i5-4590, MSI motherboard (don't remember but it was a combo deal) and an EVGA Supernova 750W semi-modular.

pjenness
Rising Star


mfmf said:



Why is it so jittery when you hold an object? This should work even if Interpolate is set to none. I have a touch project with grab that I made from Ben Roberts' YouTube tutorial, and that works without the jitter. There has to be a solution other than changing the physics time step.


It's jittery because you're updating the object's position at a different frame rate than that of the HMD or of the hand. With that exact approach, it's going to be jittery.

There are many other approaches. Parenting is a simple one. If you need solid physics, you can separate the visual mesh and rigid body, and parent the visual mesh while updating the rigid body at a lower frame rate.



I managed to have the best of both worlds with a bit of a hack:
https://forums.oculus.com/community/discussion/comment/478680/#Comment_478680

The visual renderers get parented, so it looks smooth, while the collision objects stay outside and use rigidbody movement for correct collision detection.
So far it has been working well.

-P

Drift VFX Visual, Virtual, Vertical Want 970GTX on Macbook for good FPS? https://forums.oculus.com/viewtopic.php?f=26&t=17349

pjenness
Rising Star

Rave185 said:

@pjenness

Thanks for the code sample! Just having a bit of trouble, it doesn't recognize the word User and continues to error out. Any idea why that could be happening?


Hiya

Do you have this at the start of your script file?

using Oculus.Platform.Models;


-P
Drift VFX Visual, Virtual, Vertical Want 970GTX on Macbook for good FPS? https://forums.oculus.com/viewtopic.php?f=26&t=17349

StormyDoesVR
Heroic Explorer
You're very welcome @KevinLongtime! Glad I could help! It's been about a week since I looked at it, so let me think about your second question for a sec... I believe most of it is offshoots from Ben Roberts' tutorials on YouTube, but yeah, I did change a couple of things.
------------------------------
PC specs - GTX 1070 (MSI AERO); 16GB Corsair Vengeance DDR3; intel i5-4590, MSI motherboard (don't remember but it was a combo deal) and an EVGA Supernova 750W semi-modular.

Anonymous
Not applicable

Rave185 said:

Still having trouble getting custom player avatars working though. The generic fallback avatar works, but no custom ones.


The Unity OvrAvatar is pre-wired to personalize avatars if you set the avatar's oculusUserID to a valid user ID. There's a trick though. Because the avatar gets created in the Start() function, you have to request and receive the user ID in an Awake() function. If you try to do it in Start(), it's too late.

Note: Even with Awake(), you can still get into a race condition because the call to get the user ID is asynchronous, and you don't know how long it will take.

The new docs will be live soon with some sample code that addresses all these gotchas in a straightforward way.

The sample adds a new class that initializes Oculus Platform in an Awake() function, requests the ID of the logged in user, and then waits until the callback function updates the oculusUserID value of your avatar.

  1. Import the Oculus Platform SDK's Unity package into your avatar project.
  2. Get a real App ID from the API tab of your app in the Developer Dashboard. If you haven't made an app there, go to https://dashboard.oculus.com to set yourself up as a game dev and create your first app.
  3. Copy your App ID string into the "Rift app ID" setting in the Oculus Avatars and Oculus Platform plugins.
  4. Add this script to an empty game object:
PlatformManager.cs 

using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class PlatformManager : MonoBehaviour {
    void Awake () {
        // Initialize the Platform SDK and request the logged-in user's ID here, in Awake(),
        // before OvrAvatar.cs creates the avatar in its Start().
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks(); // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            // Hand the real user ID to every avatar in the scene so it personalizes.
            OvrAvatar[] avatars = FindObjectsOfType(typeof(OvrAvatar)) as OvrAvatar[];
            foreach (OvrAvatar avatar in avatars) {
                avatar.oculusUserID = message.Data.ID;
            }
        }
    }
}