Experiential Design - Final Project : Completed Experience

21/04/2025 -  ( Week 11 - Week 14 )

Ho Winnie / 0364866 

Experiential Design / Bachelor of Design (Honours) in Creative Media 

Final Project : Completed Experience





1. Final Project : Completed Experience

Requirements : 

Students will synthesise the knowledge gained in Tasks 1, 2 and 3 for application in Task 4. Students will create and integrate visual assets and refine the prototype into a complete, working and functional product experience.

1. Project file and Folders
2. Application installation files (APK for Android, iOS build folder for iOS/iPhones)
3. Video walkthrough (Presentation)

A. Recap Of Completed Features Done In Task 3 : Click HERE for the blog post :

Below is a breakdown of the MystiAR MVP features that my teammate Lew Guo Ying and I built and completed in our Unity prototype:

1️⃣ Feeding the Pet ( Done by Guo Ying )
One of the core interactions is feeding. Users can tap to spawn the cat bowl and watch as their companion happily eats. This small gesture helps users feel a daily sense of care and connection. As the pet eats, its health or mood bar visibly increases, reinforcing the impact of the user’s actions.

2️⃣ Sleep Mode – Play Soft Music ( Done by Me )
Sometimes, companionship simply means peaceful presence. In Sleep Mode, users can switch their pet to ambient mode, where it lies down and gently sleeps beside them in AR. While the pet rests, soft background music plays to create a soothing atmosphere—perfect for unwinding, studying, or calming anxiety. Users can toggle the music on and off with a button.

3️⃣ Crafting Potion by Finding 3 Ingredients ( Done by Me )
To add a touch of fantasy, MystiAR includes a magical potion crafting feature. Users collect and drag three mystical ingredients into an AR cauldron placed in their environment. As the potion brews, animated effects and sparkles make the process feel alive and magical. The finished potion can then be given to the pet to boost its health, mood, or special traits, deepening the sense of interactive care.

4️⃣ Playing Fetch with the Pet  ( Done by Guo Ying )
No pet is complete without playful moments. Users can tap a ball icon to spawn a 3D ball, then swipe or drag to throw it in the AR scene. The pet reacts by chasing and fetching the ball, wagging its tail or jumping with joy. This playful interaction strengthens the bond between user and pet while gently lifting the pet’s mood bar, making the relationship feel responsive and real.

B. What we will be working on in our final completed experience : 

1️⃣ Voice Command Feature ( Done by Guo Ying )
When the user calls their pet’s name using the voice command button, the pet will automatically walk into the scene’s visible canvas area. This removes the need for the user to scan the surroundings to find where the pet is — making the interaction feel more natural, responsive, and convenient.

2️⃣ Adding more sound effects and particle animations to all features ( Done by Me )
To make each interaction more lively and intuitive, we plan to add sound cues and particle animations to all core features. For example, when the pet finishes eating, cute heart particles will appear to show satisfaction, and when the pet enters sleep mode, a calming “Zzz” effect will play along with soft ambient visuals.

3️⃣ Workable UI health and mood bar ( Done by Guo Ying ) 
Currently, the health and mood bars displayed above the pet are static because we haven’t fully implemented the logic to update them yet. The next step is to write scripts that dynamically increase the health bar when the pet eats a potion and boost the mood bar when the pet plays fetch. 
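As a rough sketch of what that update logic could look like (all names here are hypothetical, assuming the two bars are Unity UI Sliders assigned in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: drives the health and mood sliders from pet actions.
public class PetStats : MonoBehaviour
{
    public Slider healthBar;   // assigned in the Inspector
    public Slider moodBar;

    // Called when the pet drinks a potion.
    public void AddHealth(float amount)
    {
        healthBar.value = Mathf.Clamp01(healthBar.value + amount);
    }

    // Called when the pet plays fetch.
    public void AddMood(float amount)
    {
        moodBar.value = Mathf.Clamp01(moodBar.value + amount);
    }
}
```

The feeding and fetch scripts would then call `AddHealth()` and `AddMood()` at the end of each interaction.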

4️⃣ Screenshot Feature – Capture The Moment ( Done by Me )
We will add a screenshot feature that lets users capture their AR pet directly within the app. By tapping a camera icon on the screen, users will be able to take a snapshot of their pet in their real environment. Once the screenshot is taken, it will be automatically saved to the device’s gallery, making it easy for users to share their favorite moments with friends or post them on social media.



Progression For My Parts : 

C. Adding Effect Particles 

As discussed earlier with Mr. Razif, he suggested adding more particle effects to make the overall experience feel more lively and realistic. Based on his feedback, we decided to implement the following effects:
  • ❤️ Heart effect – Appears when the pet finishes eating, showing happiness and satisfaction.

  • 🫧 Bubble effect – Triggered after the pet consumes a potion, emphasizing the magical brewing outcome.

  • ✨ Sparkle effect – Plays while the pet is fetching the ball, adding energy and fun to the interaction.

  • 💤 “Zzz” effect – Displays when the pet is sleeping, reinforcing the calm and cozy atmosphere.

These small details aim to make MystiAR more expressive and engaging, helping users feel that their pet is truly reacting to their care and interactions.


Setting Up The Particle System : 

To set up a custom image for a Unity particle system, I first imported the texture (such as a heart or “zzz” icon) and adjusted its import settings. In the Inspector, I made sure the Texture Type was set to Default and that the alpha channel was preserved so that the transparent parts of the image would render properly in the particle effect. This ensures the texture can be used as a sprite-like particle shape.

Next, inside the Particle System settings, I selected the Alpha Blended option under Particles. This blending mode allows the particle to display with transparency, so soft edges or semi-transparent areas in the texture appear smooth when rendered.

Finally, I assigned the correct shader by going to Shaders → Legacy Shaders → Particles → Alpha Blended. This shader is important because it tells Unity how to render the particle with transparency and proper lighting. Once done, I could apply the texture to the particle material, enabling effects like floating hearts, sparkles, or “zzz” bubbles to visually enhance interactions in MystiAR.

Fig 1.1 Particle System

Adjusting Particle Settings :

After importing and assigning the textures to the particle system, the next step was adjusting the particle settings to achieve the desired visual effect.

In Unity's Inspector panel, I fine-tuned parameters such as Start Size, Start Lifetime, and Start Speed to control how big the particles appear, how long they last, and how fast they move. For example, for the heart effect, I kept the size moderate and gave it a gentle upward motion to simulate floating hearts.

I also modified the Shape module to define how particles spawn (e.g., from a sphere or cone), and enabled Color over Lifetime to create a fading effect as particles disappear. Finally, I adjusted the Emission Rate to control how many particles appear at a time, making sure the effect looks smooth and not overcrowded. These settings combined create a polished and visually appealing animation that matches the interaction—for example, hearts floating when the pet eats or “zzz” icons rising when it sleeps.
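Although I tuned these parameters in the Inspector, the same configuration can also be expressed in code through Unity's particle system modules. A hedged sketch of the heart effect's settings (the component name and values are illustrative, not the exact project values):

```csharp
using UnityEngine;

// Illustrative sketch: the Inspector tweaks described above, done from a script.
public class HeartEffectSetup : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startSize = 0.2f;       // moderate size for the hearts
        main.startLifetime = 2f;     // how long each heart lasts
        main.startSpeed = 0.5f;      // gentle upward drift

        var shape = ps.shape;
        shape.shapeType = ParticleSystemShapeType.Cone; // spawn upward in a cone

        var emission = ps.emission;
        emission.rateOverTime = 5f;  // a few hearts at a time, not overcrowded

        var col = ps.colorOverLifetime;
        col.enabled = true;          // fade out as particles disappear
    }
}
```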


Fig 1.2 Adjusting Heart & Zzz Particles

Here, I set up two different particle effects—one for a bubble effect and one for a sparkle highlight effect—to enhance the interaction feedback.

On the left, the bubble particle system is configured with a longer Start Lifetime and moderate Start Size, allowing bubbles to rise slowly and appear more fluid. The Gravity Modifier is set low to give the bubbles a floating motion, and the Emission Rate is tuned to create a steady stream without overcrowding the screen.

On the right, the highlight sparkle effect uses a very short Duration and Start Lifetime, with smaller Start Size and faster Start Speed. This creates a quick, snappy sparkle that appears and fades rapidly, perfect for actions like potion completion or successful interactions. By tweaking properties like Simulation Space, Rotation, and Color over Lifetime, I achieved two visually distinct effects that complement the user experience.

Fig 1.3 Adjusting Bubble & Sparkle Particles

Effect Manager Script : 

This step focuses on the EffectManager script, which centralizes the spawning and control of different particle effects for the pet’s interactions.

The script declares references for different effect prefabs, such as eatEffectPrefab, drinkEffectPrefab, playBallEffectPrefab, and sleepEffectPrefab, along with Transform variables to define spawn points on the pet (e.g., mouth, head, or body). A singleton pattern is implemented by assigning Instance = this; inside Awake(), ensuring that other scripts can easily access the effect manager without repeated instantiation.

For example, a PlayEatEffect() method spawns the eating effect at the pet’s mouth, attaches it to that transform, and destroys it after 2 seconds to prevent clutter. Similar methods exist for the drinking, sleep, and ball effects, making it easy to trigger visual feedback whenever the pet performs an action.
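Based on the description above, the manager might look roughly like this (a sketch, not the exact project code; only the eating effect is shown):

```csharp
using UnityEngine;

// Sketch of the EffectManager described above (field names are illustrative).
public class EffectManager : MonoBehaviour
{
    public static EffectManager Instance;

    public GameObject eatEffectPrefab;
    public Transform mouthPoint;

    void Awake()
    {
        Instance = this; // simple singleton so other scripts can trigger effects
    }

    public void PlayEatEffect()
    {
        // Spawn at the mouth, parent to it so the effect follows the pet,
        // then clean up after 2 seconds to prevent clutter.
        GameObject fx = Instantiate(eatEffectPrefab, mouthPoint.position,
                                    Quaternion.identity, mouthPoint);
        Destroy(fx, 2f);
    }
}
```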

This modular design allows all interaction scripts (feeding, potion drinking, playing fetch, and sleep mode) to simply call functions like EffectManager.Instance.PlayEatEffect(), making the codebase cleaner and easier to expand with new effects later.

Fig 1.4 Effect Manager Script

Final Settings : 

This final step shows how the EffectManager script is linked with the pet model and its specific transform points to make the visual effects functional in real-time.

On the left, you can see the Inspector setup where the script references are assigned: Eat Effect Prefab, Drink Effect Prefab, Play Ball Effect Prefab, and Sleep Effect Prefab are populated with the corresponding particle effect prefabs created earlier. The Mouth Point, Body Point, and Head Point fields are linked to the pet model’s transforms, ensuring that effects spawn at the correct locations (e.g., hearts appear near the mouth while eating, bubbles near the mouth when drinking, sparkles at the body when playing).

In the middle panel, the cat model is positioned in the scene, while on the right, the Hierarchy window shows how specific transform points (MouthPoint, BodyPoint, HeadPoint) are placed as children of the cat model. This setup finalizes the integration—whenever a user feeds, plays with, or puts the pet to sleep, the correct particle effects will appear seamlessly, making the interaction more polished and immersive.

Fig 1.5 Final Settings


Challenges Faced & Solutions :

Building the Particle Effect System was an exciting but challenging task. The first hurdle was learning how to convert 2D textures (like hearts, sparkles, and bubbles) into usable particle effects in Unity. I had to configure the textures properly by setting their Texture Type to Sprite (2D and UI) and changing their shader to Legacy Shaders → Particles → Alpha Blended. This step was essential to ensure transparency worked as expected when the particles were rendered.

Another challenge came when adjusting the particle behavior, such as size, lifetime, and emission rate. For instance, effects like bubbles required a slower upward movement, while sparkles had to disappear quickly. I solved this by fine-tuning parameters like Start Lifetime, Start Speed, and Gravity Modifier in the Particle System Inspector, ensuring each effect matched the interaction – hearts floating up after eating, bubbles appearing when drinking a potion, sparkles while playing ball, and Zzz icons while sleeping.

Finally, I needed a modular way to trigger these effects dynamically. I created an EffectManager script that stores references to each prefab and spawns the correct effect at the pet’s position. For example, public void PlayEatEffect() instantiates the heart effect at the pet’s mouth, while public void PlayBallEffect() triggers sparkles near the body point. This structure made it easy to call effects from other scripts, keeping the system flexible and reusable across different features.


D. Screenshot Feature – Capture The Moment

Next, I will be working on the Screenshot Feature – Capture The Moment, which allows users to take snapshots of their AR pet in different moments and poses. The feature will include two modes:

  • With UI: Captures the entire AR scene together with the UI elements, such as the health bar, mood bar, and action buttons, perfect for sharing gameplay moments.

  • Without UI: Captures only the pet and the AR environment without any UI overlays, giving users a clean and immersive photo of their pet.

Once a screenshot is taken, it will be saved directly to the user’s gallery, making it easy to keep memories or share their pet’s cutest moments with friends. This feature enhances user engagement by letting them personalize and capture their favorite interactions with the pet.

Setting Up The Screenshot Camera : 

The first step in implementing the screenshot feature is to set up a dedicated Camera object in Unity that will be used for capturing the AR pet. This camera is placed in the scene and configured separately from the main AR camera to ensure that the screenshots can be taken with or without UI.

On the left side of the screenshot, you can see the GameObject hierarchy, where a new ScreenshotCamera is created. Its Transform settings are adjusted to match the pet’s view, ensuring it captures the correct angle. The Camera component is configured with the appropriate projection, clipping planes, and render settings to ensure the pet and environment are properly rendered.

Additionally, the UI Canvas is set up with toggles to control whether the UI is included in the screenshot. This setup makes it possible to capture both UI-inclusive shots and clean shots without any overlays, giving the user flexibility when saving moments of their AR pet.


Fig 1.6 Setting Up Screenshot Camera 


Adding UI & Capture Control : 

In this step, UI elements are created to allow the user to trigger the screenshot function and choose whether to capture the image with or without UI overlays.

On the left side of the image, you can see a UI Panel with buttons labeled Save and Share. These buttons are connected to scripts that will handle the screenshot capture, save the image to the gallery, or share it. The Canvas Inspector on the right shows that these buttons are linked to specific scripts and functions to execute the capture when pressed.

The ScreenshotManager script is also assigned in this step. This script contains methods to render the scene from the screenshot camera, hide the UI when needed, and save the captured image. By organizing the UI and logic here, users will have a smooth experience capturing and saving moments of their AR pet in different modes.

Fig 1.7 Adding UI & Capture Controls

Testing Screenshot Feature : 

In this step, the screenshot functionality is fully integrated and tested. The UI buttons are linked to the final ScreenshotManager methods, which handle capturing the screen, saving the image to the gallery, and toggling between showing or hiding the UI.

The inspector view confirms that the button onClick events are properly set up to call functions such as CaptureWithUI() or CaptureWithoutUI(). The Game window preview shows how the screenshot will look in each mode. By pressing the buttons, users can instantly take a snapshot of their pet in AR, either keeping the UI visible or hiding it for a cleaner look.

Fig 1.8 Testing Screenshot Features


Screenshot Manager Script : 

This script implements the full logic for the Screenshot Feature – Capture The Moment, handling both UI and non-UI capture modes and saving the screenshot to the device gallery.

The script begins by defining references to the camera, UI elements, and settings for screenshot capture. These variables manage which camera will be used for rendering, the UI feedback shown after taking a screenshot, and whether UI elements should be visible during capture.
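A sketch of what those declarations might look like (field names are assumed from the description, not copied from the project):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative field declarations for the screenshot manager.
public class ScreenshotManager : MonoBehaviour
{
    public Camera screenshotCamera;   // dedicated capture camera
    public CanvasGroup uiCanvasGroup; // toggled to hide/show UI during capture
    public Text feedbackText;         // confirmation message shown after saving
}
```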

The core functionality is split into multiple methods. CaptureWithUI() and CaptureWithoutUI() toggle the UI visibility and call ScreenshotToTexture(), which performs the actual capture by rendering the camera into a RenderTexture and reading the result into a Texture2D. This ensures the screenshot can be taken cleanly without UI elements.
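A hedged sketch of that capture flow, as methods inside the manager (field names and details are assumptions, not the exact project code):

```csharp
// Sketch: render the capture camera into a RenderTexture, then read it back.
Texture2D ScreenshotToTexture()
{
    var rt = new RenderTexture(Screen.width, Screen.height, 24);
    screenshotCamera.targetTexture = rt;
    screenshotCamera.Render();

    RenderTexture.active = rt;
    var tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
    tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
    tex.Apply();

    // Clean up so the camera renders to the screen again.
    screenshotCamera.targetTexture = null;
    RenderTexture.active = null;
    Destroy(rt);
    return tex;
}

public void CaptureWithoutUI()
{
    uiCanvasGroup.alpha = 0f;              // hide UI for a clean shot
    Texture2D shot = ScreenshotToTexture();
    uiCanvasGroup.alpha = 1f;              // restore UI afterwards
    SaveScreenshot(shot);
}
```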

The SaveScreenshot() method writes the image to persistent storage, using a timestamp-based file name so that each capture is unique.
Finally, HideFeedback() deactivates the confirmation text after saving. Together, these functions allow users to capture AR pet moments with or without UI overlays, preview feedback on screen, and save the result to their gallery, making the feature user-friendly and versatile.

Fig 1.9 Full Screenshot Manager Script


Finalizing Screenshot Workflow : 

The final step ensures that the screenshot feature is fully functional and polished. Here, the Unity Inspector shows that the button click events are properly linked to the screenshot logic. The ScreenshotManager component is assigned with references to the capture camera, render texture, and UI toggling options.

The Game window preview verifies that pressing the button successfully captures the AR pet with or without the UI, depending on the selected option. The screenshot is then saved to the device gallery, allowing users to share or keep their captured moments.

At this stage, the feature is tested to ensure that all buttons work correctly, the UI toggling behaves as intended, and the saved images retain the correct resolution and framing. This completes the screenshot system, giving users a seamless way to capture and save memories of their AR pet in different modes.


Fig 2.0 Finalizing Screenshot Workflow 

Challenges Faced & Solutions : 

Working on the Screenshot Feature – Capture The Moment came with several challenges. One of the biggest issues was handling screenshots with and without UI elements. By default, Unity captures the entire screen, including overlays, so I needed to figure out how to temporarily hide the UI without breaking the user experience. I solved this by toggling the CanvasGroup.alpha value before and after capturing the screenshot, as seen in CaptureWithoutUI(), ensuring a clean capture without permanently disabling the UI.

Another challenge was saving the screenshot to the device gallery with unique filenames. Initially, screenshots would overwrite each other, which made it hard to manage saved images. I implemented a solution in SaveScreenshot(), where I generated timestamp-based filenames using System.DateTime.Now.ToString("yyyy_MM_dd_HH_mm_ss"). This ensured that every screenshot had a unique identifier, preventing accidental overwrites.

Lastly, optimizing user feedback after taking a screenshot was crucial. Without visual confirmation, users wouldn’t know if the screenshot was successful. To address this, I added a simple feedback text that appears after capture and automatically hides using the HideFeedback() method. These solutions together made the screenshot feature functional, intuitive, and user-friendly, ensuring players could easily save memorable moments of their AR pet.


Final Submission : 

Google Drive Link Click HERE – includes project files, APK downloads, etc.

MystiAR Walkthrough Video -





MystiAR Presentation Video - 



Presentation Slides -
Experiential Design Final by Winnie Ho


2. Feedback

Week 14 : 
Mr. Razif complimented the completeness of this final project and stated that we had successfully created the experience we proposed from the start. 

3. Reflection

Challenges Faced & Solutions :

Working on MystiAR gave me hands-on experience with several complex Unity features, including the screenshot feature, particle system, potion crafting system, and sleep mode. Each of these features came with unique challenges that required both technical problem-solving and teamwork.

For the screenshot feature, the main challenge was capturing the AR scene with and without UI while ensuring the image saved properly to the gallery. I solved this by using a RenderTexture-based approach to capture the camera output, combined with logic to temporarily hide UI elements before taking the screenshot. Implementing a toggle for UI visibility made it possible for users to choose their preferred screenshot mode.

The particle system was difficult because I had to convert static images into dynamic effects and adjust parameters like lifetime, speed, and size to fit each interaction. By learning how to configure textures, shaders, and emission properties in Unity, I built an EffectManager script to modularize all effects and trigger them in the right context – hearts for eating, bubbles for potion, sparkles for playing, and Zzz for sleeping.

For the potion crafting system, the biggest hurdle was making draggable ingredients work intuitively in AR. I created DraggableIngredient and DraggablePotion scripts with raycasting logic to track touch input and detect when objects were dropped into the cauldron. This was linked to the CauldronManager, which handled potion brewing and spawning the final potion.
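A minimal sketch of that dragging idea (assuming screen-space raycasts against the ingredient's collider and a trigger collider tagged "Cauldron"; the CauldronManager call is hypothetical):

```csharp
using UnityEngine;

// Sketch: drag a 3D ingredient with touch and detect drops into the cauldron.
public class DraggableIngredient : MonoBehaviour
{
    private Camera arCamera;
    private bool dragging;

    void Start() { arCamera = Camera.main; }

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        Ray ray = arCamera.ScreenPointToRay(touch.position);

        if (touch.phase == TouchPhase.Began &&
            Physics.Raycast(ray, out RaycastHit hit) &&
            hit.transform == transform)
        {
            dragging = true;
        }
        else if (touch.phase == TouchPhase.Moved && dragging)
        {
            // Keep the ingredient a fixed distance in front of the camera.
            transform.position = ray.GetPoint(1.0f);
        }
        else if (touch.phase == TouchPhase.Ended)
        {
            dragging = false;
        }
    }

    // Assumes the cauldron has a trigger collider and a Rigidbody on one side.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Cauldron"))
        {
            // CauldronManager.Instance.AddIngredient(this); // hypothetical call
            Destroy(gameObject);
        }
    }
}
```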

Lastly, the sleep mode feature required UI animations and music toggling. The challenge was syncing the pet animation state with the music player while allowing users to toggle the ambient music. I solved this by building a MusicToggleButton script that controlled both the audio source and UI feedback.
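A sketch of such a toggle (field and icon names are illustrative, not the exact project code):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: toggles the ambient music and swaps the button icon as UI feedback.
public class MusicToggleButton : MonoBehaviour
{
    public AudioSource ambientMusic;
    public Image buttonIcon;
    public Sprite onIcon;
    public Sprite offIcon;

    // Hooked up to the button's onClick event in the Inspector.
    public void ToggleMusic()
    {
        if (ambientMusic.isPlaying)
        {
            ambientMusic.Pause();
            buttonIcon.sprite = offIcon;
        }
        else
        {
            ambientMusic.Play();
            buttonIcon.sprite = onIcon;
        }
    }
}
```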

Overall, the key solutions involved breaking each problem into smaller parts, writing modular scripts (like EffectManager, CauldronManager, and Draggable scripts), and integrating animations, UI, and logic cohesively. Team collaboration and Mr. Razif’s feedback helped refine these features, especially in polishing the prototype with particle effects and sound for a more immersive experience.

Experience : 

Working on MystiAR was both exciting and extremely challenging. Developing features like the screenshot system, particle effects, potion crafting, and sleep mode required me to constantly troubleshoot complex scripts, animation states, and UI interactions. There were many sleepless nights spent debugging raycasting issues, fixing object interactions, and ensuring the AR elements worked smoothly together. Despite the difficulties, it was a rewarding experience to see each feature come to life after multiple iterations of testing and refining.

Observation :

Through this process, I realised how important teamwork and constant communication were in completing such a complex project. My partner ( Guo Ying ) and I continuously updated each other, shared solutions, and tested each other’s work to ensure all features integrated well. Feedback from Mr. Razif also helped us refine the prototype, especially by adding particle effects and sound to make the app feel more polished. Good collaboration made problem-solving easier and kept us motivated even when we were stuck.


Findings : 

Overall, this project taught me the value of perseverance, modular coding, and effective communication. Each feature became easier to build once I broke the tasks into smaller, manageable parts and worked closely with my partner to integrate everything. While the journey was full of late nights and debugging struggles, the final result was very satisfying. I am proud of what we achieved together and grateful for the teamwork that made the whole process possible.
