Experiential Design - Task 3: Project MVP Prototype
- Use Canvas, Panel, Button, and Text objects under Unity’s UI system.
- Panels represent different "screens" or views (e.g., Menu, Info Page, AR Mode).
- Use OnClick() events in the Inspector to link buttons to functions or actions. Example: A “Play” button deactivates the menu panel and activates the AR scene panel.
- Toggle visibility of UI panels using SetActive(true/false) via attached C# scripts. Panels can be layered and shown/hidden to simulate screen transitions (see the sketch after this list).
- Group all UI elements in a dedicated “UI Elements” folder for cleaner project structure.
- Each button and panel prefab can be reused across different scenes.
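As a minimal sketch of this panel-switching pattern (the script and field names below are illustrative, not taken from our project):

    using UnityEngine;

    // Illustrative panel switcher: assign the panels in the Inspector,
    // then wire the "Play" button's OnClick() event to ShowARScene.
    public class PanelSwitcher : MonoBehaviour
    {
        public GameObject menuPanel;
        public GameObject arScenePanel;

        // Called by the "Play" button's OnClick() event.
        public void ShowARScene()
        {
            menuPanel.SetActive(false);   // hide the menu "screen"
            arScenePanel.SetActive(true); // reveal the AR view
        }
    }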
In Week 7, we learned how to export and deploy a Unity AR project to a mobile phone, with a focus on understanding the platform-specific requirements for both Android and iOS.
Exporting to Android (Windows or Mac):
- Can build and export Unity projects to Android devices using either Windows or Mac.
- Unity needs the Android Build Support module installed (with SDK, NDK, and JDK).
Exporting to iOS (Mac only):
- iOS projects can only be built on a Mac. Windows systems cannot export directly to iPhone because they do not support Xcode, which is required for building and deploying to iOS devices.
- iOS projects must be exported as an Xcode project and then compiled and deployed using Xcode on a MacBook or other Apple device.
Deploying to a device:
- After building, transfer the app file (APK for Android or Xcode project for iOS) to the device.
- For Android, you can test directly by copying the APK to the phone and installing it.
- For iOS, additional setup such as an Apple Developer account and provisioning profiles is required.
In Week 8, we learned how to work with the Vuforia Engine in Unity to create markerless AR experiences using tools like AR Camera, Plane Finder, and Plane Stage. These components allow users to place digital objects onto real-world surfaces by scanning the environment with their device camera.
After setting up the scene, we proceeded to learn how to build and run the AR project on an iPhone using Xcode. This included exporting the Unity project to Xcode, configuring iOS signing credentials, and using the iOS simulator or a physical device for testing. The process required setting a valid Bundle Identifier, enabling automatic signing, and selecting a team profile to generate the provisioning and signing certificate.
In Week 9, we learned how to create a virtual room environment in Unity using simple 3D objects such as cubes and planes to form walls, floors, and furniture. In class, we also focused on how to scale objects up and down accurately to match proportions and space. This involved adjusting the X, Y, and Z scale values in the Inspector to size each item properly within the room.
This practice helped us understand spatial relationships and object alignment, which are essential for designing believable environments in AR. The tutorial we were assigned to watch at home reinforced these skills by showing how to arrange and position elements to build a clean and functional virtual room.
The tutorial itself walked us through setting up a simple 3D room using primitive objects such as planes, cubes, and walls. It showed how to build a basic interior layout by adjusting the position, scale, and materials of each object, simulating the structure of a room.
Additional elements like lighting and camera angles were also covered to make the scene feel more realistic and immersive. This room can later serve as an interactive space within an AR experience, such as for interior design, object placement, or navigation tests.
"VideoPlane")—to trigger an action. In this case, the action was playing a video using Unity’s Video Player component.🧠 Key Concepts Covered:
-
Raycasting: Used to detect if the user is pointing at a specific object.
- Video Player Integration: Add a Video Player component to the VideoPlane object. Assign a
.mp4video clip. Use Raycast detection to play the video when the object is hit.
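A rough sketch of how these two concepts combine (script name is an assumption, and the VideoPlane needs a collider for the raycast to register a hit):

    using UnityEngine;
    using UnityEngine.Video;

    // Cast a ray from the camera through the tap/click position and
    // play the video if the hit object is the "VideoPlane".
    public class VideoRaycaster : MonoBehaviour
    {
        void Update()
        {
            if (!Input.GetMouseButtonDown(0)) return; // tap or click

            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit) &&
                hit.collider.gameObject.name == "VideoPlane")
            {
                // Requires a Video Player component (with a .mp4 clip
                // assigned) on the VideoPlane object.
                var player = hit.collider.GetComponent<VideoPlayer>();
                if (player != null && !player.isPlaying) player.Play();
            }
        }
    }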
Task 3: MVP Prototype (Week 06–10, 20%):
- Build a functional app prototype (Figma).
- Test usability and key interactions.
- Submit video walkthrough + blog post.
Loading and Launch Pages: The app opens with a clean loading screen featuring our brand icon. Then, users are guided through three launch screens — Scan & Sort, Learn and Recycle, and Find Nearby Bins — which briefly explain the core features of the app in a friendly and simple way.
Home Page: On the home screen, users can see how many coins they’ve earned, how many items they’ve recycled, and how much CO₂ they’ve saved. This gives a quick sense of personal environmental impact.
Scan Interface: Users can scan an item, and our AR system identifies the material and guides them to the correct bin with visuals and instructions.
Recycling Tips Page: After scanning, the app shows item-specific recycling tips — like whether to remove labels, flatten bottles, or which bin color to use — ensuring that users recycle correctly.
Bin Finder Page: Bin Finder uses location services to show users the nearest recycling bins based on their current location.
Recycling Summary Page: Once recycling is completed, users are shown a summary screen with stats: total items recycled, coins earned, and CO₂ saved — turning eco-actions into trackable progress.
Game Pages: In the gamified mode, users are instructed to throw virtual waste into the correct bins. If they make a mistake, they get feedback like ‘Oops! Wrong bin.’ If they get it right, they’re rewarded with a cheerful animation saying ‘You sorted it right!’, encouraging continuous engagement.

We began our app development journey by designing a simple yet effective loading scene. It served as the first touchpoint between the user and the app, displaying the GreenLens logo and setting the tone for the experience ahead. We wanted it to feel quick and purposeful, not just a delay screen. Technically, this part was straightforward, but we encountered a small font rendering issue with Unicode characters in the Poppins-Medium SDF, which we resolved using fallback characters. Overall, it gave us a solid start and helped establish a clean design standard from the beginning.
The Loading Page is the introductory splash screen for the GreenLens application. It provides an initial branding experience before users are transitioned into the interactive sections of the app.
Purpose: To display the app logo and tagline briefly while backend assets initialize. Acts as a visual buffer to ensure a smooth transition to the Launching or Home scene.
UI Elements:
- Logo: A large, centered GreenLens logo with a magnifier icon representing recycling assistance.
- Tagline: “Your Recycling Tutor Assistant” reinforces the app’s function and value.
- Background: Clean green tone, symbolizing sustainability and eco-friendliness.
- Canvas Setup: Uses Unity’s UI Canvas system and is layered above the camera and directional light.
Scene Transition: This page is loaded at app start and will call the SceneLoader.cs script to transition to the “Launching” scene after a short delay or animation sequence.
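We haven’t reproduced SceneLoader.cs verbatim here, but a minimal version of what it does might look like this (the delay value and coroutine structure are assumptions):

    using System.Collections;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Waits briefly on the splash screen, then loads the next scene.
    public class SceneLoader : MonoBehaviour
    {
        public float delay = 2f; // splash duration in seconds

        void Start()
        {
            StartCoroutine(LoadAfterDelay("Launching"));
        }

        IEnumerator LoadAfterDelay(string sceneName)
        {
            yield return new WaitForSeconds(delay);
            SceneManager.LoadScene(sceneName); // scene must be added to Build Settings
        }
    }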
Launching Page (Scan & Sort)
- Purpose: Briefly introduces users to scanning items with their camera to receive AR-based sorting instructions.
- Message: “Use your camera to scan any item and get instant AR guidance on the correct recycling bin to use.”
- Interaction: Includes a “Next” button that transitions users to the next tutorial or directly into the Home Page/Scan mode.
- Panel > Background: Holds the main layout and colour background.
- Icon A–C: Animated or static illustrations representing the scanning process (recyclable bin with a magnifying glass).
- NextBtn: Interactive button component styled with TextMesh Pro, connected to navigation logic (sketched after this list).
- Title & Body Text: Centre-aligned onboarding message styled with SDF fonts for crisp display.
The Home Page serves as the main dashboard of the GreenLens AR app. It welcomes users and provides quick insights into their recycling impact, as well as easy access to other core features.
Header Section:
- User Greeting: Personalized with the user’s name (e.g., “Hi, Yan”).
- Avatar/Character Icon: Adds a fun, gamified feel to the page.
Score Stats Panel:
- Coins Earned: Tracks gamified rewards earned through recycling.
- CO₂ Saved: Displays how much carbon dioxide (CO₂) has been reduced by recycling.
- Scan Activity Icon: A visual indicator tied to scan usage or progress.
Info Section:
- Text reads: “How much carbon dioxide (CO₂) emission you can avoid from recycling different materials?”
- Tapping the info button opens the Study Scene (as seen in the OnClick() event in the Button component).
Scan Button:
- Direct call-to-action: a “Scan Item” button for initiating the item recognition feature.
Bottom Navigation Bar: Home | Game | Scan | Bin Finder | Settings
The Study Scene is an educational page designed to help users understand the science behind recycling and how it translates to carbon savings.
Features:
- Material Breakdown Table: Shows average CO₂ savings per kg of recycled material:
- Aluminum: ~10.9 kg CO₂ saved
- Plastic: ~3.5 kg CO₂ saved
- Paper: ~4.3 kg CO₂ saved
- Glass: ~0.3 kg CO₂ saved
- Steel: ~1.2 kg CO₂ saved
- Formula Section: CO₂ Saved = Weight of Material × CO₂ Factor (the app includes a user-friendly example for better comprehension; a worked sketch also follows this list).
- Navigation Bar: Includes access to other parts of the app (Home, Scan, etc.) for seamless movement between learning and action.
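To make the formula concrete: recycling 2 kg of aluminum saves about 2 × 10.9 = 21.8 kg of CO₂. A small sketch of the same calculation (the class and method names are ours, not from the app):

    using System.Collections.Generic;

    // Worked version of: CO₂ Saved = Weight of Material × CO₂ Factor.
    // Factors are kg CO₂ saved per kg recycled, from the table above.
    public static class Co2Calculator
    {
        static readonly Dictionary<string, float> Factors =
            new Dictionary<string, float>
            {
                { "Aluminum", 10.9f }, { "Plastic", 3.5f },
                { "Paper", 4.3f }, { "Glass", 0.3f }, { "Steel", 1.2f }
            };

        public static float Saved(string material, float weightKg)
            => weightKg * Factors[material];
    }

    // Example: Co2Calculator.Saved("Aluminum", 2f) returns 21.8f.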
Summary Page
The summary scene gave us the opportunity to reinforce positive feedback and celebrate user achievements. After scanning or recycling an item, users see their updated stats and a motivational message—like saving enough energy to light an LED bulb. We focused on making the layout feel rewarding without overwhelming the user. The buttons to scan another item or return home were designed for seamless flow. This scene tied the experience together and made users feel that their actions had value. It was fulfilling to build something that could potentially boost eco-friendly habits.
The Summary Scene provides immediate feedback after a user scans or sorts an item, offering a sense of achievement and reinforcing positive behavior.
Features:
- Summary Stats: Displays key recycling metrics (Coins Earned, Items Recycled, CO₂ Saved)
- Achievement Message: Example: "You’ve saved enough energy to light an LED bulb for 6 hours!" This adds a tangible, real-world context to the environmental impact.
- Navigation Buttons:
- Scan Another Item – loops user back to the scanning scene.
- Back to Home – returns user to the dashboard/home page.
Canvas > SummaryStats and Box, with all values using TextMesh Pro for clean display.For Task 3, we haven’t started coding yet, but we focused on exploring 3D model compatibility and customization in Unity. Our main goal was to test different file formats (such as .fbx) and successfully import them into the scene. We experimented with changing the color of the bins and applying textures or icons like the recycling symbol. This hands-on process helped us understand how materials, shaders, and mesh renderers work in Unity. Although there's no gameplay logic yet, this stage was important for setting up the visual foundation of the game and preparing us for the next phase, which will involve scripting interactions like drag-and-drop and scoring.
UI Components Built:
- Instruction Panel: Provides sorting guidance:
- 🔵 Blue – Paper
- 🟠 Orange – Plastic & Metal
- 🟤 Brown – Glass
- ⚫ Black – General Waste
- Score System: Placeholder ScoreNum and Score counters are included for future logic.
- Navigation Bar: Includes buttons for Home, Game, Bin Finder, and Settings.
3D Models Imported:
- Several trash bin models have been added in .fbx format.
- Materials and textures (like RecycleBinTexture) were applied successfully.
- Bin colors were changed using materials (e.g., ColourBin4), with different bins assigned their designated hues.
Although the game scene is still in progress, we’ve learned a lot from preparing its 3D elements. We imported bin models in .fbx format and experimented with applying materials, changing colors, and placing textures such as the recycling icon. Getting the material layering right was initially confusing, but we managed to assign separate materials to different mesh parts. While no interactivity has been added yet, this stage helped us understand Unity’s 3D environment better and laid the foundation for drag-and-drop functionality. It’s exciting to see the gameplay space take shape visually.
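We did all of this coloring in the editor, but the same change could later be made at runtime through the MeshRenderer. A hypothetical snippet:

    using UnityEngine;

    // Hypothetical runtime recolor: accessing .material clones the
    // shared material, so each bin keeps its own color.
    public class BinColorizer : MonoBehaviour
    {
        public Color binColor = Color.blue; // e.g. blue for the paper bin

        void Start()
        {
            GetComponent<MeshRenderer>().material.color = binColor;
        }
    }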
AR Scan Scene
For the AR Scan scene, I began by setting up the UI Canvas. I created key elements such as the Scan UI, Info Cards (Plastic, Paper, Glass), and integrated buttons like “Next” and “Back” to support scene navigation. The scene was structured inside a parent GameObject named ARScan, organizing all essential GameObjects for camera, UI, and AR targets.

I installed the Vuforia Engine through Unity’s Package Manager to enable AR functionality. After integration, I used the Vuforia Model Target Generator (MTG) to upload and train 3D models such as a plastic bottle, dropper bottle, and glass serum bottle. These models were imported in both .obj and .fbx formats to test their tracking compatibility.
Within MTG, I created Model Targets for each material category — Plastic, Paper, and Glass — and configured a Guide View for each object to help the ARCamera recognize the model from a specific angle. Once training was complete, I imported the generated datasets into Unity to activate real-world tracking.
In Unity, I configured the ARCamera and placed the Model Targets into the scene. I also positioned the corresponding 3D models in the scene to test detection accuracy and alignment during runtime.
For the animation system in the AR Scan scene, I created separate Animator Controllers for each material type (Plastic, Paper, Glass). For example, the PlasticAnimator.controller was designed with two key states: PlasticSlideIn and PlasticSlideOut. I used Unity’s Animator window to define the flow of these states, where the animation begins with PlasticSlideIn when a plastic object is detected, and transitions to PlasticSlideOut when the object is no longer recognized. I organized the GameObject hierarchy so each material's info card has its own Animator and attached the correct controller. I linked these animations to the ModelTargetEvent.cs script, which listens for the object's tracking status and plays the appropriate animation using SetTrigger() methods. This setup allowed the information card UI to appear smoothly with a sliding motion when an object is scanned and hide again when the object is removed from view. Each material has its own unique animator and animation clips to ensure smooth and responsive interaction.
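Our ModelTargetEvent.cs isn’t reproduced in full here, but its core looks roughly like this sketch, which assumes Vuforia 10+’s ObserverBehaviour API and trigger parameters named after the animation states:

    using UnityEngine;
    using Vuforia;

    // Attached to a Model Target: slides the info card in when the
    // target is tracked, and out again when tracking is lost.
    public class ModelTargetEvent : MonoBehaviour
    {
        public Animator infoCardAnimator; // e.g. the Plastic info card
        public string slideInTrigger = "PlasticSlideIn";
        public string slideOutTrigger = "PlasticSlideOut";

        ObserverBehaviour observer;

        void Awake()
        {
            observer = GetComponent<ObserverBehaviour>();
            observer.OnTargetStatusChanged += HandleStatusChanged;
        }

        void OnDestroy()
        {
            if (observer != null)
                observer.OnTargetStatusChanged -= HandleStatusChanged;
        }

        void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
        {
            bool tracked = status.Status == Status.TRACKED ||
                           status.Status == Status.EXTENDED_TRACKED;
            infoCardAnimator.SetTrigger(tracked ? slideInTrigger : slideOutTrigger);
        }
    }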
While preparing our Unity project for build and run, we encountered an issue that prevented the app from launching properly. With the help of Mr. Razif, we discovered that the problem was related to missing configurations in the Player settings. Specifically, we needed to enable "Render Over Native UI" under the iOS resolution and presentation settings. Additionally, we learned that an Event System was required in each scene to ensure that UI elements such as buttons could register input. These were small but crucial steps we had overlooked, and solving them gave us a better understanding of Unity’s build requirements and scene setup. It was a valuable debugging moment that strengthened our confidence in deploying the app correctly.
What’s In Progress / Coming Soon
One of the biggest lessons from this task was learning to embrace limitations and work around them creatively. For example, when a feature I initially wanted to implement wasn’t achievable with the available tools or time, I had to find an alternative that still fulfilled the user’s needs. This problem-solving process really pushed my creativity and technical adaptability.
Creating a walkthrough video for the prototype allowed me to reflect on my decisions and evaluate how intuitive my user journey really was. Seeing how all the pages and functions connected gave me a deeper appreciation for UX logic and consistency. I also became more conscious of accessibility and micro-interactions. While the prototype wasn’t perfect or fully polished, it served its purpose by allowing me to test the concept, learn through iteration, and gain confidence in applying technical tools.