Experiential Design - Final Project : Completed Experience
21/04/2025 - ( Week 11 - Week 14 )
Ho Winnie / 0364866
Experiential Design / Bachelor of Design (Honours) in Creative Media
1. Final Project : Completed Experience
1️⃣ Feeding the Pet ( Done by Guo Ying )
One of the core interactions is feeding. Users can tap to spawn the cat bowl and watch as their companion happily eats. This small gesture helps users feel a daily sense of care and connection. As the pet eats, its health or mood bar visibly increases, reinforcing the impact of the user’s actions.
2️⃣ Sleep Mode – Play Soft Music ( Done by Me )
Sometimes, companionship simply means peaceful presence. In Sleep Mode, users can switch their pet to ambient mode, where it lies down and gently sleeps beside them in AR. While the pet rests, soft background music plays to create a soothing atmosphere—perfect for unwinding, studying, or calming anxiety. Users can toggle a button to turn the music on and off.
3️⃣ Crafting Potion by Finding 3 Ingredients ( Done by Me )
To add a touch of fantasy, MystiAR includes a magical potion crafting feature. Users collect and drag three mystical ingredients into an AR cauldron placed in their environment. As the potion brews, animated effects and sparkles make the process feel alive and magical. The finished potion can then be given to the pet to boost its health, mood, or special traits, deepening the sense of interactive care.
4️⃣ Playing Fetch with the Pet ( Done by Guo Ying )
No pet is complete without playful moments. Users can tap a ball icon to spawn a 3D ball, then swipe or drag to throw it in the AR scene. The pet reacts by chasing and fetching the ball, wagging its tail or jumping with joy. This playful interaction strengthens the bond between user and pet while gently lifting the pet’s mood bar, making the relationship feel responsive and real.
B. What we will be working on in our final completed experience :
Currently, the health and mood bars displayed above the pet are static because we haven’t fully implemented the logic to update them yet. The next step is to write scripts that dynamically increase the health bar when the pet eats a potion and boost the mood bar when the pet plays fetch.
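As a rough sketch of the planned logic (the class and field names here are hypothetical, assuming the bars are UI Sliders wired up in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the planned stat-update logic.
public class PetStats : MonoBehaviour
{
    public Slider healthBar; // assigned in the Inspector
    public Slider moodBar;

    public void OnPotionConsumed()
    {
        // Increase health when the pet drinks a potion, clamped to the bar's max.
        healthBar.value = Mathf.Min(healthBar.value + 0.2f, healthBar.maxValue);
    }

    public void OnFetchPlayed()
    {
        // Boost mood when the pet plays fetch.
        moodBar.value = Mathf.Min(moodBar.value + 0.15f, moodBar.maxValue);
    }
}
```

The feeding and fetch scripts would then call these methods at the end of each interaction.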
- ❤️ Heart effect – Appears when the pet finishes eating, showing happiness and satisfaction.
- 🫧 Bubble effect – Triggered after the pet consumes a potion, emphasizing the magical brewing outcome.
- ✨ Light sparkle effect – Plays while the pet is fetching the ball, adding energy and fun to the interaction.
- 💤 “Zzz” effect – Displays when the pet is sleeping, reinforcing the calm and cozy atmosphere.
These small details aim to make MystiAR more expressive and engaging, helping users feel that their pet is truly reacting to their care and interactions.
Next, inside the Particle System settings, I selected the Alpha Blended option under Particles. This blending mode allows the particle to display with transparency, so soft edges or semi-transparent areas in the texture appear smooth when rendered.
Finally, I assigned the correct shader by going to Shaders → Legacy Shaders → Particles → Alpha Blended. This shader is important because it tells Unity how to render the particle with transparency and proper lighting. Once done, I could apply the texture to the particle material, enabling effects like floating hearts, sparkles, or “zzz” bubbles to visually enhance interactions in MystiAR.
In Unity's Inspector panel, I fine-tuned parameters such as Start Size, Start Lifetime, and Start Speed to control how big the particles appear, how long they last, and how fast they move. For example, for the heart effect, I kept the size moderate and gave it a gentle upward motion to simulate floating hearts.
I also modified the Shape module to define how particles spawn (e.g., from a sphere or cone), and enabled Color over Lifetime to create a fading effect as particles disappear. Finally, I adjusted the Emission Rate to control how many particles appear at a time, making sure the effect looks smooth and not overcrowded. These settings combined create a polished and visually appealing animation that matches the interaction—for example, hearts floating when the pet eats or “zzz” icons rising when it sleeps.
| Fig 1.2 Adjusting Heart & Zzz Particles |
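The same modules can also be driven from script instead of the Inspector; a sketch of roughly equivalent settings for the heart effect (all values illustrative, not the exact project numbers):

```csharp
using UnityEngine;

// Sketch: configuring the heart particle effect from code instead of the Inspector.
[RequireComponent(typeof(ParticleSystem))]
public class HeartEffectSetup : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startSize = 0.3f;     // moderate size for the hearts
        main.startLifetime = 2f;   // how long each heart lasts
        main.startSpeed = 0.5f;    // gentle upward drift

        var shape = ps.shape;
        shape.shapeType = ParticleSystemShapeType.Cone; // spawn in a narrow cone

        // Fade the particles out as they disappear (Color over Lifetime).
        var col = ps.colorOverLifetime;
        col.enabled = true;
        var grad = new Gradient();
        grad.SetKeys(
            new[] { new GradientColorKey(Color.white, 0f) },
            new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(0f, 1f) });
        col.color = grad;

        // Keep the emission rate low so the effect doesn't overcrowd the screen.
        var emission = ps.emission;
        emission.rateOverTime = 5f;
    }
}
```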
Here, I set up two different particle effects—one for a bubble effect and one for a sparkle highlight effect—to enhance the interaction feedback.
On the left, the bubble particle system is configured with a longer Start Lifetime and moderate Start Size, allowing bubbles to rise slowly and appear more fluid. The Gravity Modifier is set low to give the bubbles a floating motion, and the Emission Rate is tuned to create a steady stream without overcrowding the screen.
On the right, the highlight sparkle effect uses a very short Duration and Start Lifetime, with smaller Start Size and faster Start Speed. This creates a quick, snappy sparkle that appears and fades rapidly, perfect for actions like potion completion or successful interactions. By tweaking properties like Simulation Space, Rotation, and Color over Lifetime, I achieved two visually distinct effects that complement the user experience.
| Fig 1.3 Adjusting Bubble & Sparkle Particles |
Effect Manager Script :
The script declares references for different effect prefabs, such as eatEffectPrefab, drinkEffectPrefab, playBallEffectPrefab, and sleepEffectPrefab, along with Transform variables to define spawn points on the pet (e.g., mouth, head, or body). A singleton pattern is implemented by assigning Instance = this; inside Awake(), ensuring that other scripts can easily access the effect manager without repeated instantiation.
For example, the PlayEatEffect() method spawns the eating effect at the pet’s mouth, attaches it to the correct transform, and destroys it after 2 seconds to prevent clutter. Similar methods exist for the drinking, sleeping, and ball-playing effects, making it easy to trigger visual feedback whenever the pet performs an action.
This modular design allows all interaction scripts (feeding, potion drinking, playing fetch, and sleep mode) to simply call functions like EffectManager.Instance.PlayEatEffect(), making the codebase cleaner and easier to expand with new effects later.
| Fig 1.4 Effect Manager Script |
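Based on the description above, the EffectManager might look roughly like this (a sketch reconstructed from the prose, not the exact project code):

```csharp
using UnityEngine;

// Singleton manager that spawns short-lived particle effects on the pet.
public class EffectManager : MonoBehaviour
{
    public static EffectManager Instance;

    [Header("Effect Prefabs")]
    public GameObject eatEffectPrefab;
    public GameObject drinkEffectPrefab;
    public GameObject playBallEffectPrefab;
    public GameObject sleepEffectPrefab;

    [Header("Spawn Points on the Pet")]
    public Transform mouthPoint;
    public Transform bodyPoint;
    public Transform headPoint;

    void Awake()
    {
        // Simple singleton so other scripts can call EffectManager.Instance.
        Instance = this;
    }

    public void PlayEatEffect()   { SpawnEffect(eatEffectPrefab, mouthPoint); }
    public void PlayDrinkEffect() { SpawnEffect(drinkEffectPrefab, mouthPoint); }
    public void PlayBallEffect()  { SpawnEffect(playBallEffectPrefab, bodyPoint); }
    public void PlaySleepEffect() { SpawnEffect(sleepEffectPrefab, headPoint); }

    void SpawnEffect(GameObject prefab, Transform point)
    {
        if (prefab == null || point == null) return;
        // Parent the effect to the spawn point so it follows the pet in AR,
        // then destroy it after 2 seconds to prevent clutter.
        GameObject fx = Instantiate(prefab, point.position, point.rotation, point);
        Destroy(fx, 2f);
    }
}
```

A feeding script, for instance, would simply call `EffectManager.Instance.PlayEatEffect();` when the pet finishes eating.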
Final Settings :
This final step shows how the EffectManager script is linked with the pet model and its specific transform points to make the visual effects functional in real-time.
On the left, you can see the Inspector setup where the script references are assigned: Eat Effect Prefab, Drink Effect Prefab, Play Ball Effect Prefab, and Sleep Effect Prefab are populated with the corresponding particle effect prefabs created earlier. The Mouth Point, Body Point, and Head Point fields are linked to the pet model’s transforms, ensuring that effects spawn at the correct locations (e.g., hearts appear near the mouth while eating, bubbles near the mouth when drinking, sparkles at the body when playing).
In the middle panel, the cat model is positioned in the scene, while on the right, the Hierarchy window shows how specific transform points (MouthPoint, BodyPoint, HeadPoint) are placed as children of the cat model. This setup finalizes the integration—whenever a user feeds, plays with, or puts the pet to sleep, the correct particle effects will appear seamlessly, making the interaction more polished and immersive.
| Fig 1.5 Final Settings |
C. Screenshot Feature – Capture The Moment :
Next, I will be working on the Screenshot Feature – Capture The Moment, which allows users to take snapshots of their AR pet in different moments and poses. The feature will include two modes:
- With UI: Captures the entire AR scene together with the UI elements, such as the health bar, mood bar, and action buttons, perfect for sharing gameplay moments.
- Without UI: Captures only the pet and the AR environment without any UI overlays, giving users a clean and immersive photo of their pet.
Once a screenshot is taken, it will be saved directly to the user’s gallery, making it easy to keep memories or share their pet’s cutest moments with friends. This feature enhances user engagement by letting them personalize and capture their favorite interactions with the pet.
On the left side of the screenshot, you can see the GameObject hierarchy, where a new ScreenshotCamera is created. Its Transform settings are adjusted to match the pet’s view, ensuring it captures the correct angle. The Camera component is configured with the appropriate projection, clipping planes, and render settings to ensure the pet and environment are properly rendered.
Additionally, the UI Canvas is set up with toggles to control whether the UI is included in the screenshot. This setup makes it possible to capture both UI-inclusive shots and clean shots without any overlays, giving the user flexibility when saving moments of their AR pet.
In this step, UI elements are created to allow the user to trigger the screenshot function and choose whether to capture the image with or without UI overlays.
On the left side of the image, you can see a UI Panel with buttons labeled Save and Share. These buttons are connected to scripts that will handle the screenshot capture, save the image to the gallery, or share it. The Canvas Inspector on the right shows that these buttons are linked to specific scripts and functions to execute the capture when pressed.
The ScreenshotManager script is also assigned in this step. This script contains methods to render the scene from the screenshot camera, hide the UI when needed, and save the captured image. By organizing the UI and logic here, users will have a smooth experience capturing and saving moments of their AR pet in different modes.
| Fig 1.7 Adding UI & Capture Controls |
In this step, the screenshot functionality is fully integrated and tested. The UI buttons are linked to the final ScreenshotManager methods, which handle capturing the screen, saving the image to the gallery, and toggling between showing or hiding the UI.
The inspector view confirms that the button onClick events are properly set up to call functions such as CaptureWithUI() or CaptureWithoutUI(). The Game window preview shows how the screenshot will look in each mode. By pressing the buttons, users can instantly take a snapshot of their pet in AR, either keeping the UI visible or hiding it for a cleaner look.
| Fig 1.8 Testing Screenshot Features |
The script begins by defining references to the camera, UI elements, and settings for screenshot capture. For example:
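A sketch of what these declarations might look like (field names assumed from the description; the original snippet is not reproduced here):

```csharp
using UnityEngine;

// Sketch: fields at the top of the ScreenshotManager MonoBehaviour.
public class ScreenshotManager : MonoBehaviour
{
    [Header("Capture")]
    public Camera screenshotCamera;   // dedicated camera used for rendering
    public int captureWidth = 1080;   // illustrative resolution values
    public int captureHeight = 1920;

    [Header("UI")]
    public CanvasGroup uiCanvasGroup; // toggled to hide/show UI during capture
    public GameObject feedbackText;   // confirmation shown after a screenshot

    // ...capture and save methods are described below...
}
```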
These variables manage which camera will be used for rendering, the UI feedback after taking a screenshot, and whether UI elements should be shown during capture.
The core functionality is split into multiple methods. CaptureWithUI() and CaptureWithoutUI() toggle the UI visibility and call ScreenshotToTexture(), which handles the actual capture process using a RenderTexture and saving it to a Texture2D. For instance:
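A sketch of this capture flow, assuming the camera and CanvasGroup references described above (not the exact project code):

```csharp
using System.Collections;
using UnityEngine;

// Sketch: the capture methods inside the ScreenshotManager MonoBehaviour.
public class ScreenshotCaptureSketch : MonoBehaviour
{
    public Camera screenshotCamera;
    public CanvasGroup uiCanvasGroup;
    public int captureWidth = 1080;
    public int captureHeight = 1920;

    public void CaptureWithUI()    { StartCoroutine(Capture(true)); }
    public void CaptureWithoutUI() { StartCoroutine(Capture(false)); }

    IEnumerator Capture(bool showUI)
    {
        // Hide the UI via CanvasGroup.alpha and wait a frame so the change renders.
        uiCanvasGroup.alpha = showUI ? 1f : 0f;
        yield return new WaitForEndOfFrame();

        Texture2D shot = ScreenshotToTexture();
        // ...hand "shot" to the save routine here...

        uiCanvasGroup.alpha = 1f; // restore the UI either way
    }

    Texture2D ScreenshotToTexture()
    {
        // Render the dedicated camera into a RenderTexture, then read it back
        // into a Texture2D that can be encoded and saved.
        var rt = new RenderTexture(captureWidth, captureHeight, 24);
        screenshotCamera.targetTexture = rt;
        screenshotCamera.Render();

        RenderTexture.active = rt;
        var tex = new Texture2D(captureWidth, captureHeight, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, captureWidth, captureHeight), 0, 0);
        tex.Apply();

        // Clean up render state.
        screenshotCamera.targetTexture = null;
        RenderTexture.active = null;
        Destroy(rt);
        return tex;
    }
}
```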
This ensures the screenshot is taken cleanly without UI elements.
The SaveScreenshot() method writes the image to persistent storage and uses a timestamp-based file name for uniqueness:
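A sketch of this save routine (the "MystiAR_" file-name prefix and the feedback field are assumptions):

```csharp
using UnityEngine;

// Sketch: the save and feedback methods inside the ScreenshotManager MonoBehaviour.
public class ScreenshotSaveSketch : MonoBehaviour
{
    public GameObject feedbackText; // confirmation label shown after saving

    public void SaveScreenshot(Texture2D tex)
    {
        // Timestamp-based name so screenshots never overwrite each other.
        string fileName = "MystiAR_" +
            System.DateTime.Now.ToString("yyyy_MM_dd_HH_mm_ss") + ".png";
        string path = System.IO.Path.Combine(Application.persistentDataPath, fileName);
        System.IO.File.WriteAllBytes(path, tex.EncodeToPNG());

        // Show the confirmation, then hide it automatically after 2 seconds.
        feedbackText.SetActive(true);
        Invoke(nameof(HideFeedback), 2f);
    }

    void HideFeedback()
    {
        feedbackText.SetActive(false);
    }
}
```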
Finally, HideFeedback() deactivates the confirmation text after saving. Together, these functions allow users to capture AR pet moments with or without UI overlays, preview feedback on screen, and save the result to their gallery, making the feature user-friendly and versatile.

The final step ensures that the screenshot feature is fully functional and polished. Here, the Unity Inspector shows that the button click events are properly linked to the screenshot logic. The ScreenshotManager component is assigned with references to the capture camera, render texture, and UI toggling options.
The Game window preview verifies that pressing the button successfully captures the AR pet with or without the UI, depending on the selected option. The screenshot is then saved to the device gallery, allowing users to share or keep their captured moments.
At this stage, the feature is tested to ensure that all buttons work correctly, the UI toggling behaves as intended, and the saved images retain the correct resolution and framing. This completes the screenshot system, giving users a seamless way to capture and save memories of their AR pet in different modes.
Challenges Faced & Solutions :

Working on the Screenshot Feature – Capture The Moment came with several challenges. One of the biggest was handling screenshots with and without UI elements. By default, Unity captures the entire screen, including overlays, so I needed a way to temporarily hide the UI without breaking the user experience. I solved this by toggling the CanvasGroup.alpha value before and after capturing the screenshot, as seen in CaptureWithoutUI(), ensuring a clean capture without permanently disabling the UI.
Another challenge was saving the screenshot to the device gallery with unique filenames. Initially, screenshots would overwrite each other, which made it hard to manage saved images. I implemented a solution in SaveScreenshot(), where I generated timestamp-based filenames using System.DateTime.Now.ToString("yyyy_MM_dd_HH_mm_ss"). This ensured that every screenshot had a unique identifier, preventing accidental overwrites.
Lastly, optimizing user feedback after taking a screenshot was crucial. Without visual confirmation, users wouldn’t know if the screenshot was successful. To address this, I added a simple feedback text that appears after capture and automatically hides using the HideFeedback() method. These solutions together made the screenshot feature functional, intuitive, and user-friendly, ensuring players could easily save memorable moments of their AR pet.
2. Feedback
3. Reflection
For the screenshot feature, the main challenge was capturing the AR scene with and without UI while ensuring the image saved properly to the gallery. I solved this by using a RenderTexture-based approach to capture the camera output, combined with logic to temporarily hide UI elements before taking the screenshot. Implementing a toggle for UI visibility made it possible for users to choose their preferred screenshot mode.
The particle system was difficult because I had to convert static images into dynamic effects and adjust parameters like lifetime, speed, and size to fit each interaction. By learning how to configure textures, shaders, and emission properties in Unity, I built an EffectManager script to modularize all effects and trigger them in the right context – hearts for eating, bubbles for potion, sparkles for playing, and Zzz for sleeping.
For the potion crafting system, the biggest hurdle was making draggable ingredients work intuitively in AR. I created DraggableIngredient and DraggablePotion scripts with raycasting logic to track touch input and detect when objects were dropped into the cauldron. This was linked to the CauldronManager, which handled potion brewing and spawning the final potion.
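A minimal sketch of such raycast-based drag logic (the real DraggableIngredient and CauldronManager scripts may differ; the "Cauldron" tag and the AddIngredient message are assumptions):

```csharp
using UnityEngine;

// Sketch: drag an ingredient with touch input and detect a drop on the cauldron.
public class DraggableIngredient : MonoBehaviour
{
    public Camera arCamera; // the AR camera used for raycasting
    bool dragging;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        Ray ray = arCamera.ScreenPointToRay(touch.position);

        if (touch.phase == TouchPhase.Began &&
            Physics.Raycast(ray, out RaycastHit hit) &&
            hit.transform == transform)
        {
            // Start dragging only if the touch actually hit this ingredient.
            dragging = true;
        }
        else if (touch.phase == TouchPhase.Moved && dragging)
        {
            // Keep the ingredient a fixed distance in front of the camera.
            transform.position = ray.GetPoint(0.5f);
        }
        else if (touch.phase == TouchPhase.Ended && dragging)
        {
            dragging = false;
            // If released over the cauldron, hand the ingredient to it.
            if (Physics.Raycast(ray, out RaycastHit drop) &&
                drop.transform.CompareTag("Cauldron"))
            {
                drop.transform.SendMessage("AddIngredient", gameObject);
            }
        }
    }
}
```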
Lastly, the sleep mode feature required UI animations and music toggling. The challenge was syncing the pet animation state with the music player while allowing users to toggle the ambient music. I solved this by building a MusicToggleButton script that controlled both the audio source and UI feedback.
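A sketch of how such a toggle might work (the icon sprites are an assumption; the actual MusicToggleButton may differ):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: one button toggles the ambient sleep music and updates its icon.
public class MusicToggleButton : MonoBehaviour
{
    public AudioSource ambientMusic; // looping soft background track
    public Image buttonIcon;
    public Sprite musicOnSprite;
    public Sprite musicOffSprite;

    // Wired to the button's OnClick event in the Inspector.
    public void ToggleMusic()
    {
        if (ambientMusic.isPlaying)
            ambientMusic.Pause();
        else
            ambientMusic.Play();

        // UI feedback so the user can see the current state at a glance.
        buttonIcon.sprite = ambientMusic.isPlaying ? musicOnSprite : musicOffSprite;
    }
}
```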
Overall, the key solutions involved breaking each problem into smaller parts, writing modular scripts (like EffectManager, CauldronManager, and Draggable scripts), and integrating animations, UI, and logic cohesively. Team collaboration and Mr. Razif’s feedback helped refine these features, especially in polishing the prototype with particle effects and sound for a more immersive experience.
Observation :
Through this process, I realised how important teamwork and constant communication were in completing such a complex project. My partner ( Guo Ying ) and I continuously updated each other, shared solutions, and tested each other’s work to ensure all features integrated well. Feedback from Mr. Razif also helped us refine the prototype, especially by adding particle effects and sound to make the app feel more polished. Good collaboration made problem-solving easier and kept us motivated even when we were stuck.
