Personal projects
XR Hand Pose Toolkit
A package I’m releasing on the Unity Asset Store in Q3 2024. The toolkit gives developers tools to improve the feel of hand-tracking interactions in games and similar applications. It lets creators:
Easily craft custom hand poses for each XR Interactable in play mode, using hand tracking on compatible headsets and various directional assist tools, then serialize the poses into ScriptableObjects.
Locally blend the visual representation of the hand into a predefined pose when approaching an interactable that holds one. The result is a smoother hand interaction experience, e.g. more consistent grabbing and pulling, rather than the hand sharply snapping into position on an object (a rough sketch of this idea follows the list).
Blend between the pose created for a grab interaction and the pose used to “activate” that grab (e.g. holding a weapon and pulling the trigger).
Preview the created poses on a given interactable instance and adjust their position and rotation in that specific context.
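A minimal sketch of the core idea, assuming a hypothetical pose asset and blender component; the class names and fields here are illustrative, not the toolkit’s actual API:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical serialized hand pose: per-joint local rotations captured in
// play mode and stored as a ScriptableObject asset.
[CreateAssetMenu(menuName = "XR Hand Pose Toolkit/Hand Pose")]
public class HandPoseAsset : ScriptableObject
{
    // One local rotation per tracked hand joint, in a fixed joint order.
    public List<Quaternion> jointLocalRotations = new List<Quaternion>();
}

// Blends the visual hand rig toward a pose as it approaches an interactable,
// instead of snapping into place on grab.
public class HandPoseBlender : MonoBehaviour
{
    [SerializeField] Transform[] jointTransforms;       // visual hand joints, same order as the asset
    [SerializeField] Transform interactable;            // object that carries the target pose
    [SerializeField] HandPoseAsset targetPose;
    [SerializeField] float blendStartDistance = 0.25f;  // metres at which blending begins
    [SerializeField] float blendEndDistance = 0.05f;    // metres at which the pose is fully applied

    void LateUpdate()
    {
        if (jointTransforms == null || targetPose == null || interactable == null) return;

        float distance = Vector3.Distance(transform.position, interactable.position);
        // 0 when far away, 1 when close enough to fully assume the authored pose.
        float weight = Mathf.InverseLerp(blendStartDistance, blendEndDistance, distance);

        int count = Mathf.Min(jointTransforms.Length, targetPose.jointLocalRotations.Count);
        for (int i = 0; i < count; i++)
        {
            // Blend from the live hand-tracking rotation toward the authored pose.
            jointTransforms[i].localRotation = Quaternion.Slerp(
                jointTransforms[i].localRotation,
                targetPose.jointLocalRotations[i],
                weight);
        }
    }
}
```

Slerping each joint from its live tracked rotation toward the authored rotation, weighted by proximity, is what avoids the visible snap when the hand reaches the object.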
Modularis
As a fun experiment, as well as for internal use in my semester projects, I created a Unity package inspired by Ryan Hipple’s Unite Austin 2017 talk on game architecture.
The package contains:
Systems and helpers that allow ScriptableObjects to be used as modular event systems, decoupling code in a clean way (a rough sketch follows this list).
Custom editor scripts that facilitate debugging and functional testing of these ScriptableObjects.
Subscribable “OnValueChange” events on data containers, allowing for globally accessible event systems.
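A minimal sketch of the pattern, assuming hypothetical asset names (GameEvent, FloatVariable) that are illustrative rather than the package’s actual API:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Minimal ScriptableObject event channel in the spirit of the Unite Austin 2017 talk.
// Any system can reference this asset and raise or listen to it, with no direct
// dependency on the other systems involved.
[CreateAssetMenu(menuName = "Modularis/Game Event")]
public class GameEvent : ScriptableObject
{
    readonly List<Action> listeners = new List<Action>();

    public void Raise()
    {
        // Iterate backwards so listeners can unsubscribe during the callback.
        for (int i = listeners.Count - 1; i >= 0; i--)
            listeners[i].Invoke();
    }

    public void Subscribe(Action listener)   => listeners.Add(listener);
    public void Unsubscribe(Action listener) => listeners.Remove(listener);
}

// A data container asset with a subscribable OnValueChange event, giving
// systems a globally accessible, decoupled source of shared state.
[CreateAssetMenu(menuName = "Modularis/Float Variable")]
public class FloatVariable : ScriptableObject
{
    public event Action<float> OnValueChange;

    [SerializeField] float currentValue;

    public float Value
    {
        get => currentValue;
        set
        {
            if (Mathf.Approximately(currentValue, value)) return;
            currentValue = value;
            OnValueChange?.Invoke(currentValue);
        }
    }
}
```

Because the events and data live in assets rather than scene objects, two systems can communicate by referencing the same asset without ever referencing each other.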
AVDU - Automatic Virtual Desktop Injector for Unity
Virtual Desktop is an application that lets users wirelessly stream virtual reality (VR) content, such as games or 3D entertainment, to their VR headset. This also works for play mode in the Unity Editor, but (at the time of writing) it requires manually injecting Virtual Desktop, along with information specific to each Unity project, every time the Unity Editor is launched.
This made wireless VR development tedious, so I wrote an editor script that automatically starts the injection process when the project is opened: it gathers information about the project being launched and runs a PowerShell process that relaunches that project with Virtual Desktop injected.
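A minimal sketch of how such an editor hook can look, placed in an Editor folder. The PowerShell script name and its arguments are placeholders standing in for the actual injection command, which depends on the local Virtual Desktop installation:

```csharp
using System.Diagnostics;
using UnityEditor;
using UnityEngine;

// Runs when the editor loads the project and kicks off the injection once per session.
[InitializeOnLoad]
public static class AutoVirtualDesktopInjector
{
    const string SessionKey = "AVDU_InjectionStarted";

    static AutoVirtualDesktopInjector()
    {
        // InitializeOnLoad also fires on every domain reload, so guard with
        // session state to only inject once per editor session.
        if (SessionState.GetBool(SessionKey, false)) return;
        SessionState.SetBool(SessionKey, true);

        // Project root folder (the parent of Assets) identifies the project being launched.
        string projectPath = System.IO.Path.GetDirectoryName(Application.dataPath);

        var startInfo = new ProcessStartInfo
        {
            FileName = "powershell.exe",
            // Hypothetical script that relaunches this project through Virtual Desktop.
            Arguments = $"-ExecutionPolicy Bypass -File \"inject_virtual_desktop.ps1\" -ProjectPath \"{projectPath}\"",
            UseShellExecute = false,
            CreateNoWindow = true
        };

        Process.Start(startInfo);
        UnityEngine.Debug.Log("AVDU: started Virtual Desktop injection for " + projectPath);
    }
}
```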