Experiential Design - Final Project

⭐ 23/7/2025 - 21/8/2025 (Week 9 - Week 14)
🎀 Yan Zhi Xuan | 0369425 
💜 Experiential Design | Bachelor of Design (Hons) in Creative Media | Taylor's University 
📚 Final Project



TABLE OF CONTENTS  /ᐠ - ˕ •マ ⋆。°✩




1. Links To Tasks ⊹ ࣪ ˖₊˚⋆˙⟡

Experiential Design - Task 1: Trending Experience
Experiential Design - Task 3: Project MVP Prototype
Experiential Design - Final Project



2. INSTRUCTIONS ⊹ ࣪ ˖₊˚⋆˙⟡

Fig. 2.1 Module Information Booklet - Experiential Design.

Final Project & E-portfolio (Week 09–14, 40%):

  • Finalize app with visual and functional polish.
  • Reflect on personal/team growth.
  • Submit app files, video, and blog reflection.



3. FINAL PROJECT ⊹ ࣪ ˖₊˚⋆˙⟡

Week 12

Further Refinement of Prototype

GreenLens AR App Wireframes and Mockups

Figma Link: Draft Final Outcome

Fig. 3.1 GreenLens AR App Wireframes and Mockups (Figma).


Old Design VS New Design: UI Color Refinement & Visual Improvements

Fig. 3.2 Color Changes.

In Week 11, we focused on enhancing the visual consistency and aesthetic appeal of our app interface. One of the main changes was updating the color palette across key pages. We transitioned from a solid darker green to a softer, gradient green to give the interface a fresher, more modern look. Additionally, we adjusted the brand logo’s subtitle text color to black to improve readability against the lighter background. For the launching pages, we added subtle drop shadows under the mascot icon to create depth and visual hierarchy. On the Home screen, we refined the background color to match the updated green theme for a more cohesive experience. These UI changes were driven by user feedback and our own observations, aiming to make the app feel lighter, friendlier, and easier to navigate visually.


Fig. 3.3 Changes to the AR Scene.

In addition to color updates, we made significant UI and UX refinements in the AR Scan scene. Previously, the recycling tips and material information were static and cluttered the interface. To improve user focus and provide clearer guidance, we introduced a new visual cue: a centered scanning reticle with the prompt “Find an item to scan,” which appears when no object is detected. Once an object is scanned, the UI dynamically updates to display relevant recycling information, including the material type, item name, and a checklist of recycling tips, along with a smoother layout. We also redesigned the summary card using a more structured layout, making it easier to read and visually appealing. These changes aimed to reduce cognitive overload, guide user interaction, and ensure that the scanning and sorting process feels more responsive and intuitive. The “Add to Recycle” and “Find the Nearest Recycle Bin” buttons were repositioned and restyled to improve their accessibility and call-to-action visibility.

Fig. 3.4 Changes to the Bin Finder Page and Recycling Summary Page.

We also refined the Bin Finder and Recycling Summary pages to improve both clarity and interactivity. Initially, the map simply displayed nearby bins without any feedback. In the updated version, we added a friendly location tracking popup with a visual icon and Yes/No options, making it more engaging for users when enabling GPS. After tapping the bin location, users are now shown a real-world photo of the bin to confirm they’ve found the right one—this boosts confidence and usability, especially in unfamiliar areas. On the Recycling Summary page, we revised the layout and visual hierarchy by enlarging the earned, recycled, and saved stats, and added a cute cheering icon to provide positive reinforcement. The buttons were also restyled for consistency and easier navigation. These enhancements collectively contribute to a smoother, more rewarding user journey from start to finish.


Fig. 3.5 Changes to the Game Scene.

The biggest change came in the Game Scene, where we revamped the instructions and feedback panels, added a visible cursor to improve interaction clarity, and created separate pop-ups for game pause, right sorting, and wrong sorting. Each UI panel was redesigned with better spacing, shadows, and clear CTA buttons to guide users smoothly through the experience. Overall, these refinements greatly improved both the visual identity and user journey of our recycling assistant app.


Fig. 3.6 Changing from Model Targets to Image Targets.

For the development of GreenLens, we built the AR experience using Unity 6 together with the Vuforia Engine. Initially, in Task 3, we experimented with using Model Targets to recognize 3D objects like dropper bottles and books. While this approach worked in theory, in practice, we encountered a few issues. When we scanned the items, the recognition would flicker — making the AR content unstable. On top of that, the child objects attached to the Model Target would not display properly, which interrupted the user experience.

Because of these problems, we later decided to switch to using Image Targets instead. As shown here, we created image targets for common recyclable items like orange bins, glass bottles, notebooks, and even branded items like Coca-Cola. These image targets provided more stable tracking and smoother AR performance. It also made the interaction easier for users, especially younger audiences, which aligns with our goal of making the app both educational and accessible.



Week 13-15

Final Project Work Progress

Fig. 3.7 Unable to zip the Unity working file to send to my teammate for the iOS build.

While preparing to transfer the Unity project for the iOS build, we encountered an error during zipping:
“File not found or no read permission.” Due to file permission restrictions or a missing asset during compression, we were unable to package the GreenLens project folder and share it with my teammate.

Next Step: Restarting the Project

As a result, we had no choice but to recreate the Unity project from scratch. In the following sections, we’ll walk you through our Unity working progress from the beginning. This unexpected obstacle gave us a chance to clean up our workflow and optimize certain parts of the experience as we rebuilt it.


Loading Scene

Fig. 3.8 Animation of Brand Logo in Loading Scene.

We began by setting up the Loading Scene, which includes a soft and light green gradient background and our GreenLens brand logo. Using Unity’s built-in Animation system, we created a smooth logo rotation animation to introduce the app in a friendly and engaging way. This forms the first impression for users and sets the tone for an eco-friendly digital experience.

Fig. 3.9 AutoSceneLoader Script.

To streamline the user experience, we added an Auto Scene Loader script in our LoadingScene. This script is attached to the ScreenLoader GameObject. Its purpose is to automatically transition from the loading screen to the next scene — in our case, LaunchingA — after a short delay (set to 5 seconds).

🔧 How it works:

  • Once the app starts, the Start() function runs and initiates a coroutine.
  • The coroutine uses WaitForSeconds(delay) to hold the screen for a few seconds.
  • After the delay, it triggers SceneManager.LoadScene(nextSceneName) to move to the next scene smoothly.

This gives our brand animation a few seconds to play before switching scenes, creating a more polished and professional feel when launching the app.
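
A minimal sketch of how the AutoSceneLoader could be written, based on the behaviour described above (the field names and default values are assumptions):

using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

public class AutoSceneLoader : MonoBehaviour
{
    public string nextSceneName = "LaunchingA"; // scene to load after the delay
    public float delay = 5f;                    // seconds to hold the loading screen

    void Start()
    {
        // Kick off the timed transition as soon as the loading scene starts.
        StartCoroutine(LoadAfterDelay());
    }

    IEnumerator LoadAfterDelay()
    {
        yield return new WaitForSeconds(delay);  // let the logo animation play
        SceneManager.LoadScene(nextSceneName);   // then move on to LaunchingA
    }
}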


Launching Scene

Fig. 3.10 Launching A & Mascot Icon Animation in Launching Scene.

Our next scene, LaunchingA, introduces the app’s main feature—Scan & Sort. A playful mascot icon (animated with up-down hover movement) is paired with a simple description and a "Next" button.


Fig. 3.11 Script - MySceneManager.

The button uses a universal MySceneManager.cs script, allowing us to reuse the same script across all scenes. It handles the gotoScene("sceneName") function cleanly and efficiently, improving modularity throughout the app.
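
A minimal sketch of what this universal scene manager could look like, assuming gotoScene() simply wraps Unity's SceneManager:

using UnityEngine;
using UnityEngine.SceneManagement;

public class MySceneManager : MonoBehaviour
{
    // Linked to each button's OnClick event, with the target scene name passed as the argument.
    public void gotoScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}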

Fig. 3.12 Launching B & C.

To maintain a consistent and engaging user experience, we applied the same hover animation from Launching A to both Launching B and Launching C. These scenes highlight the app’s other core features:

  • Scan & Sort
  • Learn & Recycle
  • Find Nearby Bins

Each screen includes:

  • A lively animated mascot icon
  • Descriptive text for the feature
  • A Next button using the same universal gotoScene() method

This modular setup allowed us to reuse scripts and animation clips efficiently, reducing time and errors during development. By standardizing scene transitions and animation logic, we kept the app experience smooth, cohesive, and easy to expand.


Home Scene & Study Scene

Fig. 3.13 Home Page and Study Page.

We redesigned the Home Page to create a friendlier and more personalized user experience. Key components:

  • User Profile Avatar
  • Recycling Stats: Three key metrics encourage eco-friendly habits: Earned (rewards), Recycled (number of items), and Saved (estimated kg of CO₂ reduction).
  • Scan Item Button: Positioned centrally to encourage immediate action, this button directs users to the AR scanner.
  • 🧭 Navigation Bar: A bottom navigation bar allows quick access to Home, Game, Scan, Bins (Map), and Settings.

The Study Page helps users understand the environmental impact of recycling. Key components:

  • CO₂ Savings Table
    Lists estimated CO₂ savings per 1kg of recycled material, including: Metal, Plastic, Paper, Glass, Steel

  • Calculation Formula
    A simple formula is provided for users to estimate how much CO₂ they can save:

    CO₂ Saved = Weight of Material × CO₂ Saved per kg
  • Side-by-Side Layout
    Optimized for scrollable comparison on mobile, this section presents two panels—one showing data and the other showing how to interpret it with an example.
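
As a rough illustration of the formula above, here is a small worked example in code (the per-kg rate is a made-up placeholder, not a value from our table):

using UnityEngine;

public class Co2Example : MonoBehaviour
{
    // CO₂ Saved = Weight of Material × CO₂ Saved per kg
    void Start()
    {
        float weightKg = 2f;      // e.g. 2 kg of recycled plastic
        float co2PerKg = 1.5f;    // placeholder rate, kg CO₂ saved per kg of material
        float co2Saved = weightKg * co2PerKg;
        Debug.Log("CO₂ saved: " + co2Saved + " kg");   // prints 3 kg for this example
    }
}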


AR Scene

Fig. 3.14 AR Scene Page.

This scene enables users to interact with physical waste items using AR recognition.

  • Scan Hint UI:
    On entering the scene, users are prompted with a message to “Find an item to scan.” This guide disappears for the rest of the session once a recyclable object (plastic, paper, or glass) is detected via Vuforia image targets.

  • AR Recognition System:
    Built using Vuforia’s ObserverBehaviour, each target triggers contextual InfoCards with recycling tips, such as how to sort glass, paper, and plastic, along with visual cues for the proper disposal bin (e.g., Brown Bin for glass).

  • UI Elements:

    1. App Bar with title above the live AR camera feed
    2. ‘Add to Recycle’ and ‘Next’ buttons
    3. Real-time recycling counter (Recycle Items: 0)

        Fig. 3.15 Script - ScanHintToggle.

        Script Logic: A custom script ScanHintToggle.cs ensures the scanning hint only shows once per session. It uses OnTargetStatusChanged to check if any tracked image is detected and disables the hint UI.
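
        A minimal sketch of how ScanHintToggle.cs could be structured, assuming Vuforia 10+’s ObserverBehaviour event API (the field names are assumptions):

        using UnityEngine;
        using Vuforia;

        public class ScanHintToggle : MonoBehaviour
        {
            public GameObject scanHintUI;             // the “Find an item to scan” prompt
            public ObserverBehaviour[] imageTargets;  // all image targets in the scene
            bool hintDismissed;

            void OnEnable()
            {
                foreach (var target in imageTargets)
                    target.OnTargetStatusChanged += HandleTargetStatusChanged;
            }

            void OnDisable()
            {
                foreach (var target in imageTargets)
                    target.OnTargetStatusChanged -= HandleTargetStatusChanged;
            }

            void HandleTargetStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
            {
                // Hide the hint the first time any target is tracked, and keep it hidden for the session.
                if (!hintDismissed && status.Status == Status.TRACKED)
                {
                    scanHintUI.SetActive(false);
                    hintDismissed = true;
                }
            }
        }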

        Fig. 3.16 Script - RecycleCounter.

        Add to Recycle Button:
        Clicking this button:

        • Increments a recycling counter, updated via the RecycleCounter.cs script.
        • Displays the count as: Recycle Items: X
        • Plays a pop sound effect for feedback.
        • The counter simulates tracking user recycling behavior but doesn't yet log specific item types—this keeps the experience light and beginner-friendly.
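
        A minimal sketch of the counter logic described above (field and method names are assumptions):

        using UnityEngine;
        using TMPro;

        public class RecycleCounter : MonoBehaviour
        {
            public TMP_Text counterText;     // displays “Recycle Items: X”
            public AudioSource audioSource;
            public AudioClip popSound;       // pop sound played as feedback

            int count;

            // Wired to the “Add to Recycle” button’s OnClick event.
            public void AddToRecycle()
            {
                count++;
                counterText.text = "Recycle Items: " + count;
                audioSource.PlayOneShot(popSound);
            }
        }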


        Fig. 3.17 Image Target - Plastic.

        Fig. 3.18 Image Target - Paper.

        Fig. 3.19 Image Target - Glass.

        Real-Time Object Detection:
        Uses Vuforia image targets for different materials like:

        • Plastic 
        • Paper
        • Glass

        Each target triggers an Info Card slide-in animation with:

        • Material & item type (e.g., Plastic – Water Bottle)
        • Recycling tips (e.g., remove label, flatten)
        • Bin recommendation (e.g., Orange Bin for plastic)
        • Environmental impact (e.g., CO₂ saved, energy earned)

        AR Scene Manager: Ensures stable tracking and toggling of AR components (tips, items, bins). The same animation logic is reused across Plastic, Paper, and Glass for consistency.


        User Progression in AR Scene

        After scanning and adding an item, users press “Next” to proceed in the game journey.


        Fig. 3.20 Recycle Items Scroll View.

        Fig. 3.21 DOTween (HOTween v2).

        After completing the Recycle Counter and scrollable info panels for detected items, we implemented a horizontally scrollable view using the PageNav.cs script. This script enables DOTween-based smooth panel transitions when clicking the Next and Previous buttons. The ARScene2 UI dynamically updates with item names and materials (e.g., “Water Bottle - Plastic”) as part of the recycling guidance. DOTween was downloaded and integrated into Unity for cleaner page movement animations between panels. Each panel displays an AR-scanned item and its corresponding bin color, allowing users to navigate their recycling history interactively.


        Fig. 3.21 Previous Button.

        Fig. 3.22 Next Button.

        Scroll View Panels: Panel1, Panel2, Panel3 under Content

        Navigation Buttons: NextBtn and PreviousBtn

        Script: PageNav.cs correctly controls horizontal scroll via DOTween

        Functionality:

        • goNext() moves content left by panelWidth (to show next panel)
        • goPrev() moves content right by panelWidth (to show previous panel)

        UI Binding: Buttons are correctly linked to PageNav.goNext() and PageNav.goPrev() in the Inspector

        Animation: DOTween works smoothly, already compiled and linked.
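
        A minimal sketch of how PageNav.cs could drive these transitions with DOTween (the panel width, easing, and duration are assumptions):

        using UnityEngine;
        using DG.Tweening;

        public class PageNav : MonoBehaviour
        {
            public RectTransform content;    // Scroll View Content holding Panel1–Panel3
            public float panelWidth = 1080f; // width of one panel in canvas units
            public float tweenTime = 0.4f;

            int index;                       // currently visible panel
            const int panelCount = 3;

            public void goNext()
            {
                if (index >= panelCount - 1) return;
                index++;
                // Slide the content left by one panel width to reveal the next panel.
                content.DOAnchorPosX(-index * panelWidth, tweenTime).SetEase(Ease.OutCubic);
            }

            public void goPrev()
            {
                if (index <= 0) return;
                index--;
                // Slide the content back right to show the previous panel.
                content.DOAnchorPosX(-index * panelWidth, tweenTime).SetEase(Ease.OutCubic);
            }
        }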



        Bin Finder Scene

        Fig. 3.23 Bin Finder Page.

        On the Bin Finder page, we implemented a functional search bar that allows users to search for recycling locations. The UI includes a simple and user-friendly input field designed with rounded corners and a transparent background to match the overall aesthetic.

        We used TMP_InputField from TextMeshPro for better text rendering and styling control. To make the search interactive, we attached a custom script called SearchBarReader.cs, which reads user input when the Enter key (Return) is pressed.


        Fig. 3.24 Search Input in Search Bar Script - SearchBarReader.

        This script logs the search term in the console for testing purposes:

        if (Input.GetKeyDown(KeyCode.Return)) { Debug.Log("You typed: " + inputField.text); }

        This allows us to validate that the search input is being captured correctly. Although it's currently a basic prototype, this sets the foundation for a full search function that can filter or highlight nearby bins on the map based on the user's input. The search bar is positioned clearly at the top of the map view, making it easy for users to locate and use.
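
        For context, the snippet above could sit inside a script shaped roughly like this (the field name is an assumption):

        using UnityEngine;
        using TMPro;

        public class SearchBarReader : MonoBehaviour
        {
            public TMP_InputField inputField;   // the search bar’s input field

            void Update()
            {
                // Log the typed search term when the Enter (Return) key is pressed.
                if (Input.GetKeyDown(KeyCode.Return))
                {
                    Debug.Log("You typed: " + inputField.text);
                }
            }
        }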

        Fig. 3.25 YesBtn in Track Panel Script - TrackPanelHider.

        On the Bin Finder page, we designed a simple and interactive user flow to guide users through locating nearby recycling bins. When the screen loads, a Track Panel appears, prompting the user with a friendly message to track their location. By tapping the "Yes" button, the panel smoothly closes, creating a clean and focused interface. This is handled using the TrackPanelHider.cs script, which deactivates the panel once the user confirms.
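
        A minimal sketch of TrackPanelHider.cs as described (the method name is an assumption):

        using UnityEngine;

        public class TrackPanelHider : MonoBehaviour
        {
            public GameObject trackPanel;   // the location-tracking prompt panel

            // Wired to the Yes button’s OnClick event.
            public void HidePanel()
            {
                trackPanel.SetActive(false);
            }
        }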

        Fig. 3.26 PinBtn in Map Panel Script - PopupController.

        After that, the user can tap the green pin button, which reveals a real-life image of the bin’s actual location. This feature is powered by the PopupController.cs script, which also includes a sound effect for a more responsive experience. To prevent repeated pop-ups, we included a condition that ensures the image only shows once.
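
        A minimal sketch of how PopupController.cs could handle the one-time photo popup and sound effect (names are assumptions):

        using UnityEngine;

        public class PopupController : MonoBehaviour
        {
            public GameObject binPhotoPopup;   // real-life photo of the bin location
            public AudioSource audioSource;
            public AudioClip popupSound;

            bool hasShown;                     // prevents the popup from appearing twice

            // Wired to the green pin button’s OnClick event.
            public void ShowPopup()
            {
                if (hasShown) return;
                hasShown = true;
                binPhotoPopup.SetActive(true);
                audioSource.PlayOneShot(popupSound);
            }
        }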


        Fig. 3.27 Summary Page.

        Once the user finds the bin and taps the "Next" button, the screen transitions to a Recycling Summary. This summary celebrates the user’s effort by displaying how many coins they earned, how many items were recycled, and how much CO₂ they helped save. A cheerful message also appears, letting the user know they’ve saved enough energy to light an LED bulb for six hours—making recycling not just easy, but rewarding too.



        Game Scene

        Fig. 3.28 Image Target Bin in Game Scene.

        In our Game Scene, we implemented an Image Target Bin using Vuforia’s AR technology to enhance the interactive experience. At the start, users are presented with a clear instruction panel that explains how to sort waste correctly into four color-coded categories: Blue for paper, Orange for plastic and metal, Brown for glass, and Black for general waste. Once the user taps the "Start Game" button, they are guided to scan the real-world image target of a recycling bin. When the image is successfully detected by the AR camera, a 3D model of the bin appears. This dynamic element creates an immersive environment where users can drag waste items and drop them into the correct drop zone. The use of image targets makes the game more engaging and educational, helping users understand proper recycling practices through a hands-on experience. The pause, resume, and info buttons also provide additional support and flexibility during gameplay.


        Fig. 3.29 Script - AudioManager.

        To enhance the gameplay experience, we implemented an Audio Manager that handles both background music and sound effects. Using the AudioManager.cs script, we created methods to control the background music—allowing it to play, pause, or stop based on user interaction, such as entering the game or pausing it. We also added specific sound effects for key actions: dragging an item, dropping it, and giving correct or wrong feedback during sorting. These sound cues are triggered using PlayOneShot() for immediate playback, improving both feedback clarity and immersion. 
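
        A minimal sketch of the AudioManager described above, with separate sources for music and one-shot effects (method and field names are assumptions):

        using UnityEngine;

        public class AudioManager : MonoBehaviour
        {
            public AudioSource musicSource;   // looping background music
            public AudioSource sfxSource;     // one-shot sound effects
            public AudioClip dragSound, dropSound, correctSound, wrongSound;

            public void PlayMusic()  { musicSource.Play(); }
            public void PauseMusic() { musicSource.Pause(); }
            public void StopMusic()  { musicSource.Stop(); }

            // Immediate, overlapping playback for gameplay feedback.
            public void PlayDrag()    { sfxSource.PlayOneShot(dragSound); }
            public void PlayDrop()    { sfxSource.PlayOneShot(dropSound); }
            public void PlayCorrect() { sfxSource.PlayOneShot(correctSound); }
            public void PlayWrong()   { sfxSource.PlayOneShot(wrongSound); }
        }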


        Fig. 3.30 Script - UICursorFollow.

        Additionally, we customized the UI cursor using a UICursorFollow.cs script. This replaces the default system cursor with a stylized in-game cursor that smoothly follows the mouse position. We hid the default cursor and made sure the new one stays within the game window using Unity’s Cursor.lockState. This custom cursor adds a polished touch to the interface, making the drag-and-drop experience feel more responsive and visually cohesive.
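
        A minimal sketch of UICursorFollow.cs, assuming the custom cursor is a UI Image on a Screen Space – Overlay canvas:

        using UnityEngine;

        public class UICursorFollow : MonoBehaviour
        {
            public RectTransform cursorImage;   // the stylised in-game cursor

            void Start()
            {
                Cursor.visible = false;                      // hide the default system cursor
                Cursor.lockState = CursorLockMode.Confined;  // keep the cursor inside the game window
            }

            void Update()
            {
                // Follow the mouse each frame (works for a Screen Space – Overlay canvas).
                cursorImage.position = Input.mousePosition;
            }
        }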

        Fig. 3.31 Script - BinDropZone.

        In our recycling mini-game, we implemented a drag-and-drop system that checks whether users sort waste correctly. This system is made up of three key components: the draggable items, the drop zones, and the game manager. Each draggable item (like a bottle or cake) is assigned a specific tag (e.g., "Orange" for plastic/metal), which is compared against the accepted tag of a drop zone bin. The DraggableItem.cs script controls how each item behaves when picked up, moved, and released.

        Fig. 3.32 Script - DraggableItem.

        It also communicates with the AudioManager to play drag and drop sounds for a more interactive feel. The BinDropZone.cs script listens for when an item is dropped onto the bin and checks if the tags match.

        Fig. 3.33 Script - GameManager.

        If the item is correctly sorted, it triggers the CorrectDrop() method in GameManager.cs; otherwise, it triggers WrongDrop(). These methods handle the success or failure logic, such as activating feedback panels and playing the appropriate sound effect. The game manager also controls game flow—starting, pausing, resuming, and restarting the game—ensuring smooth transitions between different states while managing background music and interaction feedback.
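
        A simplified sketch of how the tag check and feedback could fit together, assuming a Unity UI drag-and-drop setup with IDropHandler (the actual project may wire this differently):

        using UnityEngine;
        using UnityEngine.EventSystems;

        public class BinDropZone : MonoBehaviour, IDropHandler
        {
            public string acceptedTag = "Orange";   // e.g. the Orange bin accepts plastic and metal items
            public GameManager gameManager;

            public void OnDrop(PointerEventData eventData)
            {
                GameObject droppedItem = eventData.pointerDrag;
                if (droppedItem == null) return;

                // Compare the dragged item's tag against the tag this bin accepts.
                if (droppedItem.CompareTag(acceptedTag))
                    gameManager.CorrectDrop();
                else
                    gameManager.WrongDrop();
            }
        }

        // Trimmed stub of the GameManager feedback logic (the real script also handles pause, resume, and restart).
        public class GameManager : MonoBehaviour
        {
            public GameObject correctPanel;   // “You sorted it right!” feedback
            public GameObject wrongPanel;     // “Oops! Wrong Item.” feedback

            public void CorrectDrop() { correctPanel.SetActive(true); }
            public void WrongDrop()   { wrongPanel.SetActive(true); }
        }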




        Quick Recap of GreenLens, AR Recycling Sorter App

        Fig. 3.4 Project Introduction Recap - About GreenLens.

        Here’s a quick recap of GreenLens, our AR Recycling Sorter. GreenLens is designed to help people view the world through an eco-conscious lens, quite literally, using Augmented Reality. The main goal of this project is to simplify and improve how households handle recycling by using AR to provide real-time visual feedback. Through this engaging experience, users can quickly learn how to sort their waste correctly and more sustainably.


        Fig. 3.34 AR App Concept Overview.

        Our app is structured around three core features:

        • Scan & Sort: Users can scan an item using AR and receive a visual suggestion, showing which bin it belongs in.

        • Learn & Recycle: Alongside scanning, users can also view useful recycling tips specific to different types of items.

        • Find Nearby Bins: The app will also use location data to guide users to nearby recycling bins, helping them act on the guidance easily.


        Fig. 3.35 Aim & Objectives.

        We aim to encourage better recycling habits by making the process easier, more educational, and fun. We’ve outlined three main objectives:

        1. First, to enable users to scan household items and receive instant AR guidance on which recycling bin to use.

        2. Second, to educate users about item-specific recycling practices using visual tips and cues.

        3. And third, to promote long-term sustainable habits by designing an experience that’s simple, engaging, and accessible to everyone.


        Fig. 3.36 Target Audience.

        Our target audience includes a range of everyday users.

        • Eco-conscious young adults who are already mindful of sustainability but want smarter tools to support their lifestyle.

        • Teachers with students, especially in classrooms that focus on environmental education.

        • Parents with children, who can use this tool as a fun and interactive way to teach kids about recycling.

        • And finally, the general public, since our app is designed to be easy for anyone to use regardless of age or background.


        Fig. 3.37 Conclusion.

        Summary

        Loading and Launch Pages: The app opens with a clean loading screen featuring our brand icon. Then, users are guided through three launch screens — Scan & Sort, Learn & Recycle, and Find Nearby Bins — which briefly explain the core features of the app in a friendly and simple way.

        Home Page: On the home screen, users can see how many coins they’ve earned, how many items they’ve recycled, and how much CO₂ they’ve saved. This gives a quick sense of personal environmental impact.

        Scan Interface: Users can scan an item, and our AR system identifies the material and guides them to the correct bin with visuals and instructions.

        Recycling Tips Page: After scanning, the app shows item-specific recycling tips — like whether to remove labels, flatten bottles, or which bin color to use — ensuring that users recycle correctly.

        Bin Finder Page: Bin Finder uses location services to detect the user’s position and guide them to the nearest recycling bins.

        Recycling Summary Page: Once recycling is completed, users are shown a summary screen with stats: total items recycled, coins earned, and CO₂ saved — turning eco-actions into trackable progress.

        Game Pages: In the gamified mode, users are instructed to drag the correct waste into the bins. If they make a mistake, they get feedback like ‘Oops! Wrong Item.’ If they get it right, they’re rewarded with a cheerful animation saying ‘You sorted it right!’ — encouraging continuous engagement.


        Fig. 3.38 GreenLens App.

        💜 Special Thanks 💜

        We would like to extend our heartfelt thanks to Mr. Razif and Ms. Anis for their continuous support, guidance, and patience throughout the development of our project. Mr. Razif's valuable feedback during consultations helped us improve our ideas and push our boundaries creatively. During a difficult moment when technical issues disrupted our progress, they showed great understanding and gave us the space and time we needed to recover and complete our work. We’re truly grateful for their dedication and encouragement — it made a huge difference in our learning experience and motivated us to give our best. Thank you for believing in us!



        ⭐ Final Submission

        1. Final Project Walkthrough Presentation Video with Slide Presentation: Google Drive / YouTube
        2. Walkthrough Presentation Video only: Google Drive / YouTube
        3. Presentation Slide: Canva Slide
        4. Google Drive folder: Link (includes Image Targets, Unity Project zip file, iOS build file)


        GreenLens - Walkthrough Presentation Video with Slide Presentation

        Fig. 3.39 GreenLens Slide Presentation + Walkthrough Presentation Video.
          *** While watching the video, please select 1080p (HD) for the clearest quality.

        GreenLens - Walkthrough Presentation Video

        Fig. 3.40 GreenLens Walkthrough Presentation Video.
          *** While watching the video, please select 1080p (HD) for the clearest quality.

        GreenLens Final Project Presentation Slide



        4. FEEDBACK   /ᐠ - ˕ •マ

        Mr. Razif mentioned that if the model target isn’t scanning properly, we can consider using an image target instead. He also pointed out that it’s good we’ve tried different methods to figure out the best approach for building the AR scene.



        5. REFLECTIONS    /ᐠ - ˕ •マ

        The final project felt like the most rewarding part of this module. We took all the research, proposal insights, and MVP feedback and refined them into a fully developed, functional product. It was satisfying to see everything come together, but it also required a lot of problem-solving, feedback gathering, and detailed attention.

        We worked hard to improve the visual assets and user flow, making sure the app not only looked good but worked smoothly. We revised the interface multiple times based on usability considerations and the feedback we got from informal testing. This iterative design process taught me the importance of prototyping, testing, and refining again—and again.

        What made this project especially meaningful was the E-portfolio reflection. I didn’t just reflect on the project itself—I reflected on myself as a designer. I explored how I managed teamwork, responded to challenges, took feedback, and maintained motivation. Writing about how I empathized with users and communicated with team members helped me connect design to real human experiences.

        Through this project, I saw my own growth—not just in skill, but in mindset. I became more resilient, more detail-oriented, and more confident in articulating my design thinking. This task made me realize that good design isn’t just about visuals or features; it’s about purpose, empathy, and impact.
