Below is a brief portfolio of my VR projects. Clicking the button on the right will take you to my GitHub repository, where you can access the code for the apps alongside other project files.
In collaboration with Pearson Education - XR Bootcamp Industry Partner.
Unity VR game created with Unity Editor version 2021.13.
The game was developed during XR Bootcamp in response to a brief set by Pearson Education (the client).
Universal Render Pipeline with custom materials, skybox and lightmaps
Timeline:
Prototype developed within 25 hours spread over a week (1-person team).
MVP application developed within 150 hours spread over two weeks (3-person team), including design workshops with Pearson.
Key features:
- Extended Unity XR Interaction Toolkit (XRIT) SDK
- Use of NavMesh terrain for NPC character navigation
- Two language options for language learning (English and Spanish)
- Varied game mechanics, including 'feeding words' to monsters and shooting words
- 3-level gameplay design, including an animated main menu scene
- Integrated UI design, including score board, main and in-game menus
- Integrated audio game narration using VoxBox text-to-speech
- Audio and tactile feedback
- Accessibility considerations, including use of teleportation areas alongside continuous movement
- Code refactored to fully incorporate OOP principles and allow future extensibility
- Unity Profiler used to optimise the frame rate during gameplay
The game was built as an APK package for an Oculus / Meta Quest 2 headset.
The app is based on the idea that language learning can be accelerated through imaginative storytelling and exposure through light-hearted interactions. The game narrative sees the player stranded in a galaxy of foreign-language snack-word planets, where they meet friendly word-gobbling monsters and fight enemies in order to gather the parts required to rebuild their spaceship.
In VR, words become interactable items that can be fed to monsters or used as a sleep-inducing weapon, making language learning seamless and engaging.
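As an illustration, the 'feeding words' mechanic could be sketched roughly as below. This is a hypothetical reconstruction, not the shipped code: `WordItem`, `WordGobbler`, the `expectedWord` field and the `ScoreManager` singleton are all assumed names, and the word objects are assumed to carry an XRIT grab interactable plus a collider so the monster can detect them with a trigger.

```csharp
using UnityEngine;

// Assumed component on each grabbable snack, holding the displayed word.
public class WordItem : MonoBehaviour
{
    public string Text;
}

// Hypothetical monster behaviour: eats a word dropped into its trigger
// collider if it matches the word this monster is asking for.
public class WordGobbler : MonoBehaviour
{
    [SerializeField] private string expectedWord;   // e.g. the Spanish word for the current snack
    [SerializeField] private AudioSource munchSound;

    private void OnTriggerEnter(Collider other)
    {
        var word = other.GetComponent<WordItem>();
        if (word == null) return;                   // not a word item, ignore

        if (word.Text == expectedWord)
        {
            munchSound.Play();                      // audio feedback on a correct match
            Destroy(word.gameObject);               // the monster 'eats' the word
            // A score manager (assumed) would award points here.
        }
    }
}
```

Keeping the match check on the monster rather than the word keeps each interactable dumb and reusable across planets.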
Unity VR game created with Unity Editor version 2021.13.
The game was developed during XR Bootcamp as a two-week group MVP project.
Universal Render Pipeline with custom materials, skybox and lightmaps
Timeline:
MVP application developed within 150 hours spread over two weeks (3-person team).
Key features:
- Cross-platform development (Quest 2 & Valve Index)
- Autohand SDK
- Teleportation-based Sequence Manager, including activity checkpoints
- Guided interactions throughout
- Interactive dialogue with different branches, created using the Yarn Spinner package
- Integrated audio dialogue using VoxBox text-to-speech
- Integrated UI following collaborative prototyping in ShapesXR
- Code refactored to fully incorporate OOP principles and allow future extensibility
- Unity Profiler used to optimise the frame rate during gameplay
The app's goal is to make remote first-aid training engaging and memorable by presenting the player with realistic scenarios and encouraging learning through doing, rather than the current norm of passive learning followed by quizzes.
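The teleportation-based sequence manager mentioned above could be sketched roughly as follows. This is a hypothetical sketch, not the shipped code: `SequenceManager` and `Checkpoint` are assumed names, and each checkpoint is assumed to notify the manager when its first-aid activity is completed.

```csharp
using UnityEngine;

// Assumed component marking one teleport anchor plus its training activity.
public class Checkpoint : MonoBehaviour
{
    public SequenceManager manager;

    // Called by the activity (e.g. a guided interaction) when finished.
    public void Complete() => manager.OnCheckpointCompleted(this);
}

// Hypothetical manager: activates one checkpoint at a time so the player is
// steered through the first-aid scenario step by step.
public class SequenceManager : MonoBehaviour
{
    [SerializeField] private Checkpoint[] checkpoints;  // ordered activities
    private int current;

    private void Start() => Activate(0);

    public void OnCheckpointCompleted(Checkpoint done)
    {
        if (done != checkpoints[current]) return;       // ignore out-of-order events
        current++;
        if (current < checkpoints.Length) Activate(current);
    }

    private void Activate(int index)
    {
        // Only the active checkpoint's teleport area and guidance are enabled.
        for (int i = 0; i < checkpoints.Length; i++)
            checkpoints[i].gameObject.SetActive(i == index);
    }
}
```

Gating teleport targets this way doubles as guidance: the player can only ever travel to the next activity in the sequence.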
Unity VR game created with Unity Editor version 2021.13.
The game was developed during XR Bootcamp as a one-week prototyping project. The prototype won an Accessibility Award at the end of the course.
Universal Render Pipeline with custom materials, skybox and lightmaps
Timeline:
Prototype developed within 25 hours spread over a week.
Key features:
- Developed for Quest 2
- Oculus Integration SDK
- Hand tracking and hand pose detection
- Hand poses inventory based on ASL (American Sign Language) fingerspelling
- Hand pose skeleton debugging - guiding the user to the correct hand pose
- Finger-spelling word challenge
- Integrated UI
The accurate hand tracking offered by the Oculus Integration SDK opens up an exciting new way of bringing sign-gesture learning into VR.
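Pose detection of this kind is commonly built by comparing the tracked hand skeleton against stored reference poses. The sketch below assumes the Oculus Integration's `OVRSkeleton` component; `PoseDetector`, `referencePose` and `tolerance` are hypothetical names, and the real project may match poses differently.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: match the tracked finger bones against a stored
// reference pose (one per ASL fingerspelling letter) within a tolerance.
public class PoseDetector : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton;          // tracked hand (Oculus Integration)
    [SerializeField] private List<Vector3> referencePose;   // bone positions for one letter
    [SerializeField] private float tolerance = 0.02f;       // metres per bone

    public bool Matches()
    {
        var bones = skeleton.Bones;
        for (int i = 0; i < bones.Count && i < referencePose.Count; i++)
        {
            // Compare in hand-local space so the pose is independent of
            // where the hand is held in the room.
            Vector3 local = skeleton.transform.InverseTransformPoint(bones[i].Transform.position);
            if (Vector3.Distance(local, referencePose[i]) > tolerance)
                return false;
        }
        return true;
    }
}
```

The same per-bone distances could also drive the skeleton-debugging feature, highlighting which fingers are still out of position.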
Unity VR game created with Unity Editor version 2020.3.
Universal Render Pipeline with custom materials, skybox and lightmaps
Key features:
- 3-level gameplay design
- integrated UI design, including score board, main and in-game menus
- integrated audio game narration
- audio and tactile feedback
- custom C# code behaviours assigned to game objects, including integrated Game Manager
- use of bow and arrow in VR, based on the 'VR with Andrew' YouTube tutorial
- use of NavMesh terrain for enemy navigation
- grabbable objects / trigger events
- accessibility considerations, including use of teleportation areas alongside continuous movement
- code refactored to fully incorporate principles of OOP
- Unity Profiler used to optimise the FPS rate during gameplay
The game was built as an Android APK package for an Oculus / Meta Quest 2 headset.
The idea was to create an immersive shooter game, set in a forest where the player is tasked with protecting a pen of chickens from hungry zombies.
The use of a bow and arrow in VR augments the gameplay by introducing a realistic physical dynamic that is normally lacking in 3D games.
The challenge of providing a non-intrusive UI in a VR environment is solved through extensive audio narration.
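The NavMesh-based enemy navigation listed above typically amounts to very little code once the terrain is baked. A minimal sketch, assuming a `ZombieMover` component and a `chickenPen` target transform (both hypothetical names):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch: each zombie path-finds across the baked NavMesh
// terrain towards the chicken pen it is trying to raid.
[RequireComponent(typeof(NavMeshAgent))]
public class ZombieMover : MonoBehaviour
{
    [SerializeField] private Transform chickenPen;   // the pen the zombies attack
    private NavMeshAgent agent;

    private void Awake() => agent = GetComponent<NavMeshAgent>();

    private void Update()
    {
        // The agent handles steering and obstacle avoidance; the behaviour
        // only has to keep the destination up to date.
        agent.SetDestination(chickenPen.position);
    }
}
```

Letting the NavMeshAgent own the path-finding keeps enemy behaviours small and makes new enemy types cheap to add.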
Unity VR game created with Unity Editor version 2020.3.
Rendered using Universal Render Pipeline with custom materials, skybox and lightmaps
Key features:
- 3-level gameplay design + increased difficulty timed rounds ('Mass attack')
- data persistence between sessions - top score displayed and loaded from a JSON file
- custom C# code behaviours assigned to game objects, including integrated Game Manager
- integrated UI design, including dynamic scoreboard on the chips
- audio and tactile feedback
The game was built as an Android APK package for an Oculus / Meta Quest 2 headset.
The premise of the game is to save your chips from ravenous seagulls flying towards you at different speeds and frequencies.
Although the core concept is incredibly simple, the gameplay is kept varied and interesting through increasing difficulty levels and randomised 'Mass attack' events, during which the seagulls travel towards the player at increased speed.
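The JSON-based score persistence listed above is usually built on Unity's `JsonUtility` and `Application.persistentDataPath`, which survives between sessions on the headset. A minimal sketch, with `SaveData` and `ScoreStore` as assumed names (the real file layout may differ):

```csharp
using System.IO;
using UnityEngine;

[System.Serializable]
public class SaveData
{
    public int topScore;    // highest score achieved across all sessions
}

// Hypothetical sketch of the top-score persistence.
public static class ScoreStore
{
    private static string FilePath =>
        Path.Combine(Application.persistentDataPath, "savefile.json");

    public static void Save(int score)
    {
        var data = new SaveData { topScore = score };
        File.WriteAllText(FilePath, JsonUtility.ToJson(data));
    }

    public static int Load()
    {
        if (!File.Exists(FilePath)) return 0;   // no score saved yet
        return JsonUtility.FromJson<SaveData>(File.ReadAllText(FilePath)).topScore;
    }
}
```

`JsonUtility` only serialises public fields of `[System.Serializable]` types, which is why the save data lives in a small dedicated class.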
Unity VR game created with Unity Editor version 2021.13.
The game was developed during XR Bootcamp as a one-week prototyping project.
Universal Render Pipeline with custom materials, skybox and lightmaps
Timeline:
Prototype developed within 25 hours spread over a week.
Key features:
- Developed for Quest 2
- Oculus Integration SDK
- Hand tracking & controller tracking
- Hand pose detection for swiping the cards
- Integrated UI
How do you provide a customisable learning experience? Learning, and particularly memorising, can be tedious, as it requires long concentration spans in a world that is usually full of distractions. Bringing learning experiences into VR:
My key priority was to provide a comfortable and fun UI that encourages the user to keep coming back to the experience.
Copyright © 2021 agnieszka zielke - All Rights Reserved.