Hey folks, I'm sure you've seen the latest updates to Oculus Avatars,
which introduced OVRLipsync-driven mouth movement, eye-gaze simulation,
and micro-expressions. I wanted to flag something we came up against
while we worked on this update, in case it...
CONFIDENTIAL: We appreciate your discretion during this early access
period. Please don't discuss these updates or share publicly any images
or video of expressive Avatars in action. We'd love your feedback. Please
keep all comments / discussions regardi...
With the introduction of gaze targets and eye behaviors into the Avatar
SDK, I thought it would be good to outline how gaze works, along with
some thoughts on gaze targets. We've built gaze targeting to be very
simple in the first version of the expressive up...
Hey there, @carrotstien -- we're looking to address GC, load times, and
runtime memory allocation in 1.38 and 1.39. Stay tuned! I've also
escalated your fix to the team. Thanks for flagging.
Hey @"goohyun.jung.1", check out this link in our docs:
https://developer.oculus.com/documentation/avatarsdk/latest/concepts/avatars-gsg-unity/#custom-touch-grip-poses
Hey @knchaffin, we're working hard to get the native expressive sample
wrapped up and ready to go in the next couple of releases. Apologies for
the delay.
@programistunity -- each of your apps will currently see a different
app-scoped ID for a given user. To enable them to co-mingle, you first
need to set up an app grouping (see documentation). This allows both
apps to be speaking in terms of...
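To make the idea concrete, here is a purely hypothetical sketch (not the actual Platform API -- the names `app_scoped_id` and `AppGrouping` are invented for illustration): each app sees its own opaque, app-scoped ID for the same user, and a grouping acts as the shared namespace that lets those per-app IDs be resolved back to one common user.

```python
import hashlib

def app_scoped_id(user_key: str, app_id: str) -> str:
    """Hypothetical: each app derives a different opaque ID for the same user."""
    return hashlib.sha256(f"{app_id}:{user_key}".encode()).hexdigest()[:16]

class AppGrouping:
    """Hypothetical grouping: a shared namespace that resolves each
    member app's app-scoped IDs back to one common user key."""
    def __init__(self, app_ids):
        self.app_ids = list(app_ids)
        self._reverse = {}  # (app_id, scoped_id) -> user_key

    def register_user(self, user_key: str):
        for app_id in self.app_ids:
            self._reverse[(app_id, app_scoped_id(user_key, app_id))] = user_key

    def resolve(self, app_id: str, scoped_id: str) -> str:
        return self._reverse[(app_id, scoped_id)]

group = AppGrouping(["app_a", "app_b"])
group.register_user("user-123")

id_in_a = app_scoped_id("user-123", "app_a")
id_in_b = app_scoped_id("user-123", "app_b")
# The two apps hold different IDs, yet the grouping maps both to one user.
assert id_in_a != id_in_b
assert group.resolve("app_a", id_in_a) == group.resolve("app_b", id_in_b)
```

The point of the sketch is only that without the grouping, neither app has any way to tell that its ID and the other app's ID refer to the same person.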