Experiential Design - Final Project
Final Project & E-portfolio (Week 09–14, 40%):
- Finalize app with visual and functional polish.
- Reflect on personal/team growth.
- Submit app files, video, and blog reflection.
In Week 11, we focused on enhancing the visual consistency and aesthetic appeal of our app interface. One of the main changes was updating the color palette across key pages. We transitioned from a solid darker green to a softer, gradient green to give the interface a fresher, more modern look. Additionally, we adjusted the brand logo’s subtitle text color to black to improve readability against the lighter background. For the launching pages, we added subtle drop shadows under the mascot icon to create depth and visual hierarchy. On the Home screen, we refined the background color to match the updated green theme for a more cohesive experience. These UI changes were driven by user feedback and our own observations, aiming to make the app feel lighter, friendlier, and easier to navigate visually.
For the development of GreenLens, we built the AR experience using Unity 6 together with the Vuforia Engine. Initially, in Task 3, we experimented with using Model Targets to recognize 3D objects like dropper bottles and books. While this approach worked in theory, in practice, we encountered a few issues. When we scanned the items, the recognition would flicker — making the AR content unstable. On top of that, the child objects attached to the Model Target would not display properly, which interrupted the user experience.
Because of these problems, we later decided to switch to using Image Targets instead. As shown here, we created image targets for common recyclable items like orange bins, glass bottles, notebooks, and even branded items like Coca-Cola. These image targets provided more stable tracking and smoother AR performance. It also made the interaction easier for users, especially younger audiences, which aligns with our goal of making the app both educational and accessible.
While preparing to transfer the Unity project for iOS build, we encountered an error during zipping:
“File not found or no read permission.” This issue prevented us from sharing the GreenLens Unity project. Due to file permission restrictions or a missing asset during compression, we were unable to package the project folder successfully.
Next Step: Restarting the Project
As a result, we had no choice but to recreate the Unity project from scratch. In the following sections, we’ll walk you through our Unity working progress from the beginning. This unexpected obstacle gave us a chance to clean up our workflow and optimize certain parts of the experience as we rebuilt it.
Loading Scene
To streamline the user experience, we added an Auto Scene Loader script in our LoadingScene. This script is attached to the ScreenLoader GameObject. Its purpose is to automatically transition from the loading screen to the next scene — in our case, LaunchingA — after a short delay (set to 5 seconds).
🔧 How it works:
- Once the app starts, the Start() function runs and initiates a coroutine.
- The coroutine uses WaitForSeconds(delay) to hold the screen for a few seconds.
- After the delay, it triggers SceneManager.LoadScene(nextSceneName) to move to the next scene smoothly.
This gives our brand animation a few seconds to play before switching scenes, creating a more polished and professional feel when launching the app.
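The loader described above can be sketched roughly like this (the field names are illustrative; only Start(), WaitForSeconds, and SceneManager.LoadScene are taken from our description):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the Auto Scene Loader: wait a few seconds, then load the next scene.
public class AutoSceneLoader : MonoBehaviour
{
    public string nextSceneName = "LaunchingA";
    public float delay = 5f; // seconds to hold the loading screen

    void Start()
    {
        StartCoroutine(LoadAfterDelay());
    }

    IEnumerator LoadAfterDelay()
    {
        // Hold the loading screen while the brand animation plays.
        yield return new WaitForSeconds(delay);
        SceneManager.LoadScene(nextSceneName);
    }
}
```

Attaching this to the ScreenLoader GameObject in LoadingScene is all that's needed; both fields can be tuned in the Inspector.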
Our next scene, LaunchingA, introduces the app’s main feature—Scan & Sort. A playful mascot icon (animated with up-down hover movement) is paired with a simple description and a "Next" button.
We also wrote a MySceneManager.cs script that we reuse across all scenes. It handles the gotoScene("sceneName") function cleanly and efficiently, improving modularity throughout the app.

To maintain a consistent and engaging user experience, we applied the same hover animation from Launching A to both Launching B and Launching C. These scenes highlight the app's core features:
- Scan & Sort
- Learn & Recycle
- Find Nearby Bins
Each screen includes:
- A lively animated mascot icon
- Descriptive text for the feature
- A Next button using the same universal gotoScene() method
This modular setup allowed us to reuse scripts and animation clips efficiently, reducing time and errors during development. By standardizing scene transitions and animation logic, we kept the app experience smooth, cohesive, and easy to expand.
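The shared scene-switching script is only a few lines; a sketch consistent with how we use it (each Next button's OnClick event passes the target scene's name):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Shared scene-switching helper, reused across all scenes.
public class MySceneManager : MonoBehaviour
{
    // Hooked to each Next button's OnClick event in the Inspector,
    // with the destination scene name (e.g. "LaunchingB") as the argument.
    public void gotoScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```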
We redesigned the Home Page to create a friendlier and more personalized user experience.
- User Profile Avatar
- Recycling Stats: Three key metrics encourage eco-friendly habits: Earned (rewards), Recycled (number of items), and Saved (estimated CO₂ reduction in kg).
- Scan Item Button: Positioned centrally to encourage immediate action, this button directs users to the AR scanner.
- 🧭 Navigation Bar: A bottom navigation bar allows quick access to Home, Game, Scan, Bins (Map), and Settings.
The Study Page helps users understand the environmental impact of recycling. Key components:
- CO₂ Savings Table: Lists estimated CO₂ savings per 1kg of recycled material, including Metal, Plastic, Paper, Glass, and Steel.
- Calculation Formula: A simple formula is provided for users to estimate how much CO₂ they can save.
- Side-by-Side Layout: Optimized for scrollable comparison on mobile, this section presents two panels, one showing data and the other showing how to interpret it with an example.
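As an illustration of the kind of calculation this page walks users through (the per-kg factor here is a placeholder, not the app's published value):

```csharp
// Illustrative only: estimate CO₂ saved from a recycled item's weight.
// savingPerKg is the material's factor from the CO₂ Savings Table.
public static class Co2Estimator
{
    public static float EstimateSavedKg(float weightKg, float savingPerKg)
    {
        // CO₂ saved (kg) = item weight (kg) × CO₂ saving per kg of that material
        return weightKg * savingPerKg;
    }
}
```

For example, recycling 0.5 kg of a material with a 2 kg-CO₂-per-kg factor would show an estimated saving of 1 kg of CO₂.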
AR Scene
This scene enables users to interact with physical waste items using AR recognition.
Scan Hint UI:
On entering the scene, users are prompted with a message to “Find an item to scan.” This guide disappears permanently once a recyclable object (plastic, paper, or glass) is detected via Vuforia image targets.

AR Recognition System:
Built using Vuforia’s ObserverBehaviour, each target triggers contextual InfoCards with recycling tips, such as how to sort glass, paper, and plastic, plus visual cues for the proper disposal bin (e.g., Brown Bin for glass).
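A simplified sketch of how an ObserverBehaviour on each image target can drive its InfoCard (the class and field names here are illustrative, not our exact script):

```csharp
using UnityEngine;
using Vuforia;

// Shows this target's InfoCard while the image is being tracked.
public class TargetInfoCard : MonoBehaviour
{
    public GameObject infoCard;   // the recycling-tips panel for this material
    ObserverBehaviour observer;

    void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Show the card while the target is tracked; hide it otherwise.
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        infoCard.SetActive(tracked);
    }

    void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= OnStatusChanged;
    }
}
```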
UI Elements:
- App Bar with title
- Live AR Camera Feed
- ‘Add to Recycle’ and ‘Next’ buttons
- Real-time recycling counter (Recycle Items: 0)
ScanHintToggle.cs ensures the scanning hint only shows once per session. It uses OnTargetStatusChanged to check if any tracked image is detected and then disables the hint UI.

Add to Recycle Button:
Clicking this button:
- Increments a recycling counter, updated via the RecycleCounter.cs script.
- Displays the count as: Recycle Items: X
- Plays a pop sound effect for feedback.
- The counter simulates tracking user recycling behavior but doesn't yet log specific item types—this keeps the experience light and beginner-friendly.
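The counter logic behind the button can be sketched like this (a minimal version using the legacy UI Text for brevity; field names are assumptions):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the 'Add to Recycle' counter with pop-sound feedback.
public class RecycleCounter : MonoBehaviour
{
    public Text counterLabel;        // shows "Recycle Items: X"
    public AudioSource audioSource;
    public AudioClip popSound;

    int count;

    // Hooked to the 'Add to Recycle' button's OnClick event.
    public void AddItem()
    {
        count++;
        counterLabel.text = "Recycle Items: " + count;
        audioSource.PlayOneShot(popSound); // pop feedback on each add
    }
}
```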
Real-Time Object Detection:
Uses Vuforia image targets for different materials like:
- Plastic
- Paper
- Glass
Each target triggers an Info Card slide-in animation with:
- Material & item type (e.g., Plastic – Water Bottle)
- Recycling tips (e.g., remove label, flatten)
- Bin recommendation (e.g., Orange Bin for plastic)
- Environmental impact (e.g., CO₂ saved, energy earned)
We also wrote a PageNav.cs script, which enables DOTween-based smooth panel transitions when clicking the Next and Previous buttons. The ARScene2 UI dynamically updates with item names and materials (e.g., “Water Bottle - Plastic”) as part of the recycling guidance. DOTween was downloaded and integrated into Unity for cleaner page-movement animations between panels. Each panel displays an AR-scanned item and its corresponding bin color, allowing users to navigate their recycling history interactively.

Scroll View Panels: Panel1, Panel2, Panel3 under Content
Navigation Buttons: NextBtn and PreviousBtn
Script: PageNav.cs correctly controls horizontal scroll via DOTween
Functionality:
- goNext() moves the content left by panelWidth (to show the next panel)
- goPrev() moves the content right by panelWidth (to show the previous panel)
The Next and Previous buttons call PageNav.goNext() and PageNav.goPrev() via their OnClick events in the Inspector.
Animation: DOTween works smoothly, already compiled and linked.
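A sketch of the DOTween panel navigation (panel width, duration, easing, and clamping are our assumptions here, not the exact script):

```csharp
using UnityEngine;
using DG.Tweening;

// Slides the Scroll View's Content strip left/right by one panel width.
public class PageNav : MonoBehaviour
{
    public RectTransform content;   // holds Panel1, Panel2, Panel3
    public float panelWidth = 1080f;
    public float duration = 0.4f;

    int index;                      // index of the panel currently in view
    const int panelCount = 3;

    public void goNext()
    {
        if (index >= panelCount - 1) return; // already at the last panel
        index++;
        Slide();
    }

    public void goPrev()
    {
        if (index <= 0) return;              // already at the first panel
        index--;
        Slide();
    }

    void Slide()
    {
        // Move the whole content strip so the current panel is in view.
        content.DOAnchorPosX(-index * panelWidth, duration).SetEase(Ease.OutCubic);
    }
}
```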
On the Bin Finder page, we implemented a functional search bar that allows users to search for recycling locations. The UI includes a simple and user-friendly input field designed with rounded corners and a transparent background to match the overall aesthetic.
We used TMP_InputField from TextMeshPro for better text rendering and styling control. To make the search interactive, we attached a custom script called SearchBarReader.cs, which reads user input when the Enter key (Return) is pressed.
This script logs the search term in the console for testing purposes:
if (Input.GetKeyDown(KeyCode.Return)) { Debug.Log("You typed: " + inputField.text); }
This allows us to validate that the search input is being captured correctly. Although it's currently a basic prototype, this sets the foundation for a full search function that can filter or highlight nearby bins on the map based on the user's input. The search bar is positioned clearly at the top of the map view, making it easy for users to locate and use.
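For context, the whole prototype script is only a few lines; a sketch consistent with the snippet quoted above:

```csharp
using TMPro;
using UnityEngine;

// Prototype search-bar reader: logs the query when the user presses Enter.
public class SearchBarReader : MonoBehaviour
{
    public TMP_InputField inputField;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Return))
        {
            Debug.Log("You typed: " + inputField.text);
        }
    }
}
```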
We added a TrackPanelHider.cs script, which deactivates the panel once the user confirms, and a PopupController.cs script, which also includes a sound effect for a more responsive experience. To prevent repeated pop-ups, we included a condition that ensures the image only shows once. Once the user finds the bin and taps the "Next" button, the screen transitions to a Recycling Summary. This summary celebrates the user’s effort by displaying how many coins they earned, how many items were recycled, and how much CO₂ they helped save. A cheerful message also appears, letting the user know they’ve saved enough energy to light an LED bulb for six hours, making recycling not just easy, but rewarding too.
In our Game Scene, we implemented an Image Target Bin using Vuforia’s AR technology to enhance the interactive experience. At the start, users are presented with a clear instruction panel that explains how to sort waste correctly into four color-coded categories: Blue for paper, Orange for plastic and metal, Brown for glass, and Black for general waste. Once the user taps the "Start Game" button, they are guided to scan the real-world image target of a recycling bin. When the image is successfully detected by the AR camera, a 3D model of the bin appears. This dynamic element creates an immersive environment where users can drag the correct waste item and drop them into the drop zone. The use of image targets makes the game more engaging and educational, helping users understand proper recycling practices through a hands-on experience. The pause, resume, and info buttons also provide additional support and flexibility during gameplay.
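The sorting check at the heart of the drag-and-drop game can be sketched like this (a simplified version; the tag values and the GameManager reference are assumptions for illustration):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Drop zone that checks whether the dropped item's tag matches this bin.
public class BinDropZone : MonoBehaviour, IDropHandler
{
    public string acceptedTag = "Plastic"; // e.g. the Orange bin accepts plastic
    public GameManager gameManager;

    public void OnDrop(PointerEventData eventData)
    {
        GameObject item = eventData.pointerDrag; // the object being dragged
        if (item == null) return;

        if (item.CompareTag(acceptedTag))
            gameManager.CorrectDrop();  // success feedback + sound
        else
            gameManager.WrongDrop();    // 'Oops! Wrong Item' feedback
    }
}
```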
In our AudioManager.cs script, we created methods to control the background music, allowing it to play, pause, or stop based on user interaction, such as entering the game or pausing it. We also added specific sound effects for key actions: dragging an item, dropping it, and giving correct or wrong feedback during sorting. These sound cues are triggered using PlayOneShot() for immediate playback, improving both feedback clarity and immersion.

We also wrote a UICursorFollow.cs script, which replaces the default system cursor with a stylized in-game cursor that smoothly follows the mouse position. We hid the default cursor and made sure the new one stays within the game window using Unity’s Cursor.lockState. This custom cursor adds a polished touch to the interface, making the drag-and-drop experience feel more responsive and visually cohesive.

The DraggableItem.cs script controls how each item behaves when picked up, moved, and released, and calls the AudioManager to play drag and drop sounds for a more interactive feel. The BinDropZone.cs script listens for when an item is dropped onto the bin and checks if the tags match. If they do, it calls the CorrectDrop() method in GameManager.cs; otherwise, it triggers WrongDrop(). These methods handle the success or failure logic, such as activating feedback panels and playing the appropriate sound effect. The game manager also controls game flow, starting, pausing, resuming, and restarting the game, ensuring smooth transitions between different states while managing background music and interaction feedback.

Here’s a quick recap.
GreenLens, our AR Recycling Sorter, is designed to help people view the world through an eco-conscious lens, quite literally, using Augmented Reality. The main goal of this project is to simplify and improve how households handle recycling by using AR to provide real-time visual feedback. Through this engaging experience, users can quickly learn how to sort their waste correctly and more sustainably.
Our app is structured around three core features:
Scan & Sort: Users can scan an item using AR and receive a visual suggestion, showing which bin it belongs in.
Learn & Recycle: Alongside scanning, users can also view useful recycling tips specific to different types of items.
Find Nearby Bins: The app will also use location data to guide users to nearby recycling bins, helping them act on the guidance easily.
We aim to encourage better recycling habits by making the process easier, more educational, and fun. We’ve outlined three main objectives:
First, to enable users to scan household items and receive instant AR guidance on which recycling bin to use.
Second, to educate users about item-specific recycling practices using visual tips and cues.
And third, to promote long-term sustainable habits by designing an experience that’s simple, engaging, and accessible to everyone.
Our target audience includes a range of everyday users.
Eco-conscious young adults who are already mindful of sustainability but want smarter tools to support their lifestyle.
Teachers with students, especially in classrooms that focus on environmental education.
Parents with children, who can use this tool as a fun and interactive way to teach kids about recycling.
And finally, the general public, since our app is designed to be easy for anyone to use regardless of age or background.
Summary
Loading and Launch Pages: The app opens with a clean loading screen featuring our brand icon. Then, users are guided through three launch screens — Scan & Sort, Learn and Recycle, and Find Nearby Bins — which briefly explain the core features of the app in a friendly and simple way.
Home Page: On the home screen, users can see how many coins they’ve earned, how many items they’ve recycled, and how much CO₂ they’ve saved. This gives a quick sense of personal environmental impact.
Scan Interface: Users can scan an item, and our AR system identifies the material and guides them to the correct bin with visuals and instructions.
Recycling Tips Page: After scanning, the app shows item-specific recycling tips — like whether to remove labels, flatten bottles, or which bin color to use — ensuring that users recycle correctly.
Bin Finder Page: Bin Finder uses location services to help users find the nearest recycling bins, matched by the user’s location.
Recycling Summary Page: Once recycling is completed, users are shown a summary screen with stats: total items recycled, coins earned, and CO₂ saved — turning eco-actions into trackable progress.
Game Pages: In the gamified mode, users are instructed to drag the correct waste into the bins. If they make a mistake, they get feedback like ‘Oops! Wrong Item.’ If they get it right, they’re rewarded with a cheerful animation saying ‘You sorted it right!’ — encouraging continuous engagement.
We worked hard to improve the visual assets and user flow, making sure the app not only looked good but worked smoothly. We revised the interface multiple times based on usability considerations and the feedback we got from informal testing. This iterative design process taught me the importance of prototyping, testing, and refining again—and again.
What made this project especially meaningful was the E-portfolio reflection. I didn’t just reflect on the project itself—I reflected on myself as a designer. I explored how I managed teamwork, responded to challenges, took feedback, and maintained motivation. Writing about how I empathized with users and communicated with team members helped me connect design to real human experiences.
Through this project, I saw my own growth—not just in skill, but in mindset. I became more resilient, more detail-oriented, and more confident in articulating my design thinking. This task made me realize that good design isn’t just about visuals or features; it’s about purpose, empathy, and impact.


