Experiential Design - Task 3: Project MVP Prototype

⭐ 26/5/2024 - 30/7/2025 (Week 6 - Week 10)
🎀 Yan Zhi Xuan | 0369425 
💜 Experiential Design | Bachelor of Design (Hons) in Creative Media | Taylor's University 
📚 Task 3: Project MVP Prototype



TABLE OF CONTENTS  /ᐠ - ˕ •マ ⋆。°✩




1. LECTURES ⊹ ࣪ ˖₊˚⋆˙⟡

Week 6
Fig. 1.1 Creating User Interface & Screen Navigation (UI Elements in Unity).
YouTube video: Link

UI Element Sources: Google Drive

In Week 6, we explored UI design in Unity, learning how to create interactive UI elements and navigate between multiple screens using buttons. This knowledge builds the foundation for managing app flows in AR and non-AR experiences.

UI Elements Setup:
  • Use Canvas, Panel, Button, and Text objects under Unity’s UI system.
  • Panels represent different "screens" or views (e.g., Menu, Info Page, AR Mode).
Button Interaction:
  • Use OnClick() events in the Inspector to link buttons to functions or actions. Example: A “Play” button deactivates the menu panel and activates the AR scene panel.
Screen Navigation Logic:
  • Toggle visibility of UI panels using SetActive(true/false) via attached C# scripts. Panels can be layered and shown/hidden to simulate screen transitions (see the sketch after this list).
UI Folder Organization:
  • Group all UI elements in a dedicated “UI Elements” folder for cleaner project structure.
  • Each button and panel prefab can be reused across different scenes.
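To make the panel-switching idea concrete, below is a minimal sketch of such a script, assuming two panels assigned in the Inspector; the script and panel names are placeholders rather than the exact objects from class.

    csharp
    using UnityEngine;

    // Hypothetical example: switches between a menu panel and an AR panel.
    // Hook ShowARScreen()/ShowMenu() to Button OnClick() events in the Inspector.
    public class ScreenNavigator : MonoBehaviour
    {
        [SerializeField] private GameObject menuPanel; // assigned in the Inspector
        [SerializeField] private GameObject arPanel;   // assigned in the Inspector

        public void ShowARScreen()
        {
            menuPanel.SetActive(false);
            arPanel.SetActive(true);
        }

        public void ShowMenu()
        {
            arPanel.SetActive(false);
            menuPanel.SetActive(true);
        }
    }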
💜 Self-Reflection: This week helped me understand how to visually organize and control the flow of an app through UI. Learning how to switch between screens using buttons made me realize the importance of clear navigation in user experience design, especially for AR apps where the interface must be intuitive. I also gained hands-on experience in structuring my Unity project folders and UI logic properly, which will make future development more efficient and scalable.

Week 7
Fig. 1.2 Exporting Unity AR Project to Mobile Devices.
YouTube video: Link

In Week 7, we learned how to export and deploy a Unity AR project to a mobile phone, with a focus on understanding the platform-specific requirements for both Android and iOS.

Exporting to Android (Windows or Mac): 

  • Can build and export Unity projects to Android devices using either Windows or Mac. 
  • Unity needs the Android Build Support module installed (with SDK, NDK, and JDK).
Exporting to iPhone (iOS): 
  • Can only build iOS projects on a Mac. Windows systems cannot export directly to iPhone because they do not support Xcode, which is required for building and deploying to iOS devices. 
  • iOS projects must be exported as an Xcode project and then compiled and deployed using Xcode on a MacBook or other Apple device.
Final Steps for Deployment: 
  • After building, transfer the app file (APK for Android or Xcode project for iOS) to the device. 
  • For Android, you can test directly by copying the APK to the phone and installing it. 
  • For iOS, additional setup such as an Apple Developer account and provisioning profiles is required.
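On the Unity side, a build can also be kicked off from an editor script instead of the Build Settings window. The following is a rough sketch using Unity’s BuildPipeline API; the scene paths and output location are placeholders, not our actual build setup.

    csharp
    using UnityEditor;
    using UnityEngine;

    // Hypothetical editor script: builds an Android APK from the listed scenes.
    // Place it in an "Editor" folder; scene paths and the output path are placeholders.
    public static class AndroidBuilder
    {
        [MenuItem("Build/Android APK")]
        public static void BuildApk()
        {
            var options = new BuildPlayerOptions
            {
                scenes = new[] { "Assets/Scenes/Loading.unity", "Assets/Scenes/Home.unity" },
                locationPathName = "Builds/GreenLens.apk",
                target = BuildTarget.Android,
                options = BuildOptions.None
            };
            BuildPipeline.BuildPlayer(options);
            Debug.Log("Android build finished.");
        }
    }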

💜 Self-Reflection: This week helped me understand the technical steps and limitations of deploying AR apps to mobile devices. It was especially important to learn that exporting to iOS requires a Mac system with Xcode, which is a key consideration for future project planning. Now I’m more aware of the tools and environments needed for both platforms, and I feel more confident in preparing Unity projects for real-world testing on smartphones.

Week 8

Fig. 1.3 Markerless AR with Vuforia & Building to iOS with Xcode.
YouTube video: Link

Installing macOS on Windows using virtual machine: YouTube Tutorial, Techrechard

In Week 8, we learned how to work with the Vuforia Engine in Unity to create markerless AR experiences using tools like the AR Camera, Plane Finder, and Ground Plane Stage. These components allow users to place digital objects onto real-world surfaces by scanning the environment with their device camera.

After setting up the scene, we proceeded to learn how to build and run the AR project on an iPhone using Xcode. This included exporting the Unity project to Xcode, configuring iOS signing credentials, and using the iOS simulator or a physical device for testing. The process required setting a valid Bundle Identifier, enabling automatic signing, and selecting a team profile to generate the provisioning and signing certificate.

💜 Self-Reflection: This week gave me a complete overview of how to bring an AR project from Unity into a real iPhone environment. Setting up Vuforia’s Plane Finder and Ground Plane Stage helped me understand how AR elements can realistically align with physical space. Learning how to navigate Xcode and handle signing certificates made me feel more confident in preparing and testing apps for iOS, which is a big step toward real-world deployment.

Week 9
Fig. 1.4 Scaling & Virtual Space Building
YouTube video: Link

In Week 9, we learned how to create a virtual room environment in Unity using simple 3D objects such as cubes and planes to form walls, floors, and furniture. In class, we also focused on how to scale objects up and down accurately to match proportions and space. This involved adjusting the X, Y, and Z scale values in the Inspector to size each item properly within the room.

This practice helped us understand spatial relationships and object alignment, which are essential for designing believable environments in AR. The tutorial we were assigned to watch at home reinforced these skills by showing how to arrange and position elements to build a clean and functional virtual room.
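The same scaling can also be driven from a script instead of the Inspector; here is a tiny illustrative example (the object and dimensions are made up).

    csharp
    using UnityEngine;

    // Hypothetical example: sizes a cube into a 4 m x 0.1 m x 3 m floor slab.
    public class FloorScaler : MonoBehaviour
    {
        void Start()
        {
            // Equivalent to typing X, Y, and Z scale values in the Inspector.
            transform.localScale = new Vector3(4f, 0.1f, 3f);
        }
    }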

💜 Self-Reflection: Learning how to scale and arrange 3D objects gave me a better grasp of Unity’s spatial tools. It was satisfying to see how quickly a room could come together using basic shapes and careful resizing. This session helped me see the importance of proportion and positioning when creating virtual spaces—especially for AR projects where realistic object sizes matter. It also made me more confident working with Unity’s transform tools.

Fig. 1.5 Creating a Virtual Room & Object Scaling in Unity. 
YouTube video: Link

In Week 9, we were tasked with creating a virtual room environment in Unity by following an assigned tutorial at home. The tutorial guided us through setting up a simple 3D room using primitive objects such as planes, cubes, and walls. We learned how to build a basic interior layout by adjusting the position, scale, and materials of each object, simulating the structure of a room.

Additional elements like lighting and camera angles were also covered to make the scene feel more realistic and immersive. This room can later serve as an interactive space within an AR experience, such as for interior design, object placement, or navigation tests.
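As a rough illustration, the same kind of room shell can also be blocked out from code using Unity’s primitives; this is a small sketch with assumed dimensions, not the tutorial’s actual steps.

    csharp
    using UnityEngine;

    // Hypothetical example: creates a floor and one wall from primitives,
    // mirroring what the tutorial does by hand in the editor.
    public class RoomBuilder : MonoBehaviour
    {
        void Start()
        {
            GameObject floor = GameObject.CreatePrimitive(PrimitiveType.Plane);
            floor.transform.localScale = new Vector3(0.5f, 1f, 0.5f); // a Plane is 10 x 10 units, so this gives about 5 m x 5 m

            GameObject wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
            wall.transform.localScale = new Vector3(5f, 2.5f, 0.1f);  // width x height x thickness
            wall.transform.position = new Vector3(0f, 1.25f, 2.5f);   // stand it on the floor at the back edge
        }
    }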

💜 Self-Reflection: Following the tutorial helped me understand how to construct and arrange 3D environments inside Unity. It was interesting to see how basic shapes like cubes and planes could be transformed into walls, floors, and furniture just by tweaking their size and materials. I also gained more confidence working with Unity’s layout tools and using lighting to bring depth to the scene. This activity gave me a good foundation for building spatially-aware AR experiences in the future.

Week 10
Fig. 1.6 Video Player.
YouTube video: Link

In Week 10, we continued developing the Week 9 virtual room project, adding interactivity using Raycast detection. The goal was to detect when the user is looking at or tapping on specific objects—like a video screen (named "VideoPlane")—to trigger an action. In this case, the action was playing a video using Unity’s Video Player component.

🧠 Key Concepts Covered:

  • Raycasting: Used to detect if the user is pointing at a specific object.

    csharp
    if (hit.collider.gameObject.name == "VideoPlane")
    {
        Debug.Log("VideoPlane was hit");
        // Trigger video playback here
    }
  • Video Player Integration: Add a Video Player component to the VideoPlane object. Assign a .mp4 video clip. Use Raycast detection to play the video when the object is hit.
This combination allows users to interact with 3D objects in AR, creating interactive storytelling or learning experiences inside the virtual room.
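Putting the two pieces together, a minimal sketch of the tap-to-play logic could look like the following; the object name "VideoPlane" comes from the class exercise, while everything else (script name, input handling) is assumed for illustration.

    csharp
    using UnityEngine;
    using UnityEngine.Video;

    // Hypothetical example: raycasts from a screen tap and plays the video
    // attached to the "VideoPlane" object when it is hit.
    public class VideoPlaneTrigger : MonoBehaviour
    {
        void Update()
        {
            if (!Input.GetMouseButtonDown(0)) return;

            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit) &&
                hit.collider.gameObject.name == "VideoPlane")
            {
                Debug.Log("VideoPlane was hit");
                VideoPlayer player = hit.collider.GetComponent<VideoPlayer>();
                if (player != null && !player.isPlaying)
                {
                    player.Play(); // plays the .mp4 clip assigned to the Video Player component
                }
            }
        }
    }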

💜 Self-Reflection: This week helped me bring together object detection and multimedia interactivity in Unity. By using Raycast to detect when the user is looking at a specific object and triggering a video, I learned how to create more immersive and responsive AR content. I also improved my understanding of basic scripting logic and how to connect different Unity components like colliders, raycasting, and the video player. This hands-on exercise made me more confident in building functional, interactive AR prototypes.



2. INSTRUCTIONS ⊹ ࣪ ˖₊˚⋆˙⟡

Fig. 2.1 Module Information Booklet - Experiential Design.

Task 3: MVP Prototype (Week 06–10, 20%):

  • Build a functional app prototype (Figma).
  • Test usability and key interactions.
  • Submit video walkthrough + blog post.



3. TASK 3: PROJECT MVP PROTOTYPE ⊹ ࣪ ˖₊˚⋆˙⟡

Week 6

Visualization

Mood Board

To establish a consistent look and feel, we created a mood board that reflects our app’s tone. Our visual references include AR UI examples, eco-themed illustrations, and a clean interface layout to guide our design direction. The goal is to make recycling approachable, educational, and even fun.

Our color palette includes both brand and functional colors. The brand colors are light green, dark green, and black, while functional colors such as green, white, and light yellow express sustainability and freshness. We also include the recycling bin colors – blue, orange, brown, and black – to match Malaysia’s real-life bins, helping users connect the app with real-world actions.

For typography, we use Poppins, a clean and modern sans-serif font. We apply different font weights to establish hierarchy, from bold titles to readable body text. This ensures clarity on all screens, especially on mobile devices.

Fig. 3.1 Mood Board of GreenLens.

We used a 5-column auto-width grid system based on the iPhone 14 Pro Max frame with a 16-pixel gutter to ensure consistent spacing and mobile-friendly alignment across the app UI.

Fig. 3.2 GreenLens UI Kit (Figma).

User Flowchart of GreenLens

Fig. 3.3 GreenLens User Flowchart (Figma).

GreenLens AR App Wireframes and Mockups

Figma Link: Draft Final Outcome 

Fig. 3.4 GreenLens AR App Wireframes and Mockups (Figma).

Loading and Launch Pages: The app opens with a clean loading screen featuring our brand icon. Then, users are guided through three launch screens — Scan & Sort, Learn and Recycle, and Find Nearby Bins — which briefly explain the core features of the app in a friendly and simple way. 

Home Page: On the home screen, users can see how many coins they’ve earned, how many items they’ve recycled, and how much CO₂ they’ve saved. This gives a quick sense of personal environmental impact.

Scan Interface: Users can scan an item, and our AR system identifies the material and guides them to the correct bin with visuals and instructions.

Recycling Tips Page: After scanning, the app shows item-specific recycling tips — like whether to remove labels, flatten bottles, or which bin color to use — ensuring that users recycle correctly.

Bin Finder Page: Bin Finder uses location services to help users find the nearest recycling bins based on their current location.

Recycling Summary Page: Once recycling is completed, users are shown a summary screen with stats: total items recycled, coins earned, and CO₂ saved — turning eco-actions into trackable progress.

Game Pages: In the gamified mode, users are instructed to throw virtual waste into the correct bins. If they make a mistake, they get feedback like ‘Oops! Wrong bin.’ If they get it right, they’re rewarded with a cheerful animation saying ‘You sorted it right!’ — encouraging continuous engagement.
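As a rough illustration of how that right/wrong feedback could be wired up in Unity, here is a small sketch; the tags, messages, and component names are placeholders rather than our final game code.

    csharp
    using UnityEngine;
    using TMPro;

    // Hypothetical example: compares a dropped item's tag with the bin's accepted tag
    // and shows the matching feedback message.
    public class RecycleBin : MonoBehaviour
    {
        [SerializeField] private string acceptedTag = "Plastic"; // e.g. "Paper", "Glass", "General"
        [SerializeField] private TMP_Text feedbackText;

        void OnTriggerEnter(Collider other)
        {
            feedbackText.text = other.CompareTag(acceptedTag)
                ? "You sorted it right!"
                : "Oops! Wrong bin.";
        }
    }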



Week 8

First Stage of Functioning Prototype Development in Unity

Loading Page

We began our app development journey by designing a simple yet effective loading scene. It served as the first touchpoint between the user and the app, displaying the GreenLens logo and setting the tone for the experience ahead. We wanted it to feel quick and purposeful, not just a delay screen. Technically, this part was straightforward, but we encountered a small font rendering issue with Unicode characters in the Poppins-Medium SDF, which we resolved using fallback characters. Overall, it gave us a solid start and helped establish a clean design standard from the beginning.

Fig. 3.5 Loading Page and AutoSceneLoader.cs

The Loading Page is the introductory splash screen for the GreenLens application. It provides an initial branding experience before users are transitioned into the interactive sections of the app.

Purpose: To display the app logo and tagline briefly while backend assets initialize. Acts as a visual buffer to ensure a smooth transition to the Launching or Home scene.

UI Elements:

  • Logo: A large, centered GreenLens logo with a magnifier icon representing recycling assistance.
  • Tagline: “Your Recycling Tutor Assistant” reinforces the app’s function and value.
  • Background: Clean green tone, symbolizing sustainability and eco-friendliness.
  • Canvas Setup: Uses Unity’s UI Canvas system and is layered above the camera and directional light.

Scene Transition: This page is loaded at app start and will call the SceneLoader.cs script to transition to the “Launching” scene after a short delay or animation sequence.
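A minimal sketch of what that auto-transition behaviour can look like is shown below; the delay value and scene name are illustrative, and the real AutoSceneLoader.cs / SceneLoader.cs may differ.

    csharp
    using System.Collections;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Hypothetical sketch: waits a few seconds on the splash screen,
    // then loads the next scene by name.
    public class AutoSceneLoader : MonoBehaviour
    {
        [SerializeField] private string nextScene = "Launching";
        [SerializeField] private float delaySeconds = 3f;

        IEnumerator Start()
        {
            yield return new WaitForSeconds(delaySeconds);
            SceneManager.LoadScene(nextScene);
        }
    }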

Launching Pages

Next comes the launching scene, which functions like an onboarding tutorial for users. We used Illustrator to design clean vector assets and exported them as SVGs to keep the file lightweight. The challenge was ensuring the visuals were simple enough to understand while still being engaging. This scene helped us bridge design and functionality, setting users up with the confidence to start scanning and sorting waste using AR.

Fig. 3.6 Launching Scene and SceneManager.cs

The Launching Scene acts as the onboarding screen of our AR Recycling Sorter app. Built in Unity, this scene introduces users to the purpose of the app with a clean and friendly visual style.

Scene Functionality:
  • Purpose: Briefly guides users to use their camera to scan items and receive AR-based sorting instructions.
  • Message: “Use your camera to scan any item and get instant AR guidance on the correct recycling bin to use.”
  • Interaction: Includes a “Next” button that transitions users to the next tutorial or directly into the Home Page/Scan mode.

UI Elements:
  • Panel > Background: Holds the main layout and colour background.
  • Icon A–C: Animated or static illustrations representing the scanning process (recyclable bin with a magnifying glass).
  • NextBtn: Interactive button component styled with TextMesh Pro, connected to navigation logic.
  • Title & Body Text: Centre-aligned onboarding message styled with SDF fonts for crisp display.
Fig. 3.7 Other Scripts for Launching Scene.

Fig. 3.8 Sketchfab / Unity plugin.

I found that importing 3D models from Sketchfab directly through the Unity plugin is a fast and efficient way to enhance AR experiences, especially when I'm prototyping or adding quick visual assets. It automatically brings in the model with textures and materials, saving me a lot of setup time. However, when I tried importing manually using FBX, I realized I had to manually assign the materials and textures, which was quite tedious. I learned that using glTF format is a better option because it preserves the textures and hierarchy more accurately in Unity. This process taught me how important it is to choose the right file format when working with 3D assets in AR projects.


Week 8

Second Stage of Functioning Prototype Development in Unity

Home Page

During Week 8, designing the home scene allowed us to experiment with user personalization and real-time feedback. We incorporated welcoming text, user stats like coins earned and CO₂ saved, and a visually engaging interface. One of our priorities was creating a layout that feels both informative and rewarding. We encountered some difficulty with spacing and alignment across devices, but we adjusted the canvas settings to handle different screen sizes. Integrating the navigation bar made the scene feel more complete. This scene became our central hub and helped us understand how to balance design with dynamic content needs.

Fig. 3.9 Home Scene.

The Home Page serves as the main dashboard of the GreenLens AR app. It welcomes users and provides quick insights into their recycling impact, as well as easy access to other core features.

Header Section:

  • User Greeting: Personalized with the user’s name (e.g., “Hi, Yan”).
  • Avatar/Character Icon: Adds a fun, gamified feel to the page.

Score Stats Panel:

  • Coins Earned: Tracks gamified rewards earned through recycling.
  • CO₂ Saved: Displays how much carbon dioxide (CO₂) has been reduced by recycling.
  • Scan Activity Icon: A visual indicator tied to scan usage or progress.

Info Section:

  • Text reads: “How much carbon dioxide (CO₂) emission you can avoid from recycling different materials?”
  • Tapping the info button opens the Study Scene (as seen in the OnClick() event in the Button component).

Scan Button:

  • Direct call-to-action: “Scan Item” button for initiating the item recognition feature.

Bottom Navigation Bar: Home | Game | Scan | Bin Finder | Settings

Study Page

We developed the study scene to support user education by explaining the environmental impact of recycling. Presenting scientific data in a digestible way was important to us, so we created a table showing average CO₂ savings for different materials and included a simple formula users could understand. It was a rewarding challenge to simplify technical information while maintaining accuracy. By connecting learning to in-app actions, we hope to enhance user motivation and give deeper meaning to their recycling efforts. This scene turned out to be a strong informational anchor for the app.

Fig. 3.10 Study Scene.

The Study Scene is an educational page designed to help users understand the science behind recycling and how it translates to carbon savings.

Features:

  • Material Breakdown Table: Shows average CO₂ savings per kg of recycled material:
    • Aluminum: ~10.9 kg CO₂ saved
    • Plastic: ~3.5 kg CO₂
    • Paper: ~4.3 kg CO₂
    • Glass: ~0.3 kg CO₂
    • Steel: ~1.2 kg CO₂
  • Formula Section: CO₂ Saved = Weight of Material × CO₂ Factor (includes a user-friendly example for better comprehension; a quick worked example also follows this list).
  • Navigation Bar: Includes access to other parts of the app (Home, Scan, etc.) for seamless movement between learning and action.
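For instance, applying the factors above, here is a short sketch of the calculation; the weights used in the comment are purely illustrative.

    csharp
    // Hypothetical helper implementing the formula above.
    // factorKgPerKg is the CO₂ factor for the material (e.g. 10.9 for aluminum, 4.3 for paper).
    public static float Co2Saved(float weightKg, float factorKgPerKg)
    {
        return weightKg * factorKgPerKg; // e.g. 0.5 kg aluminum -> 0.5 * 10.9 = 5.45 kg CO₂ saved
    }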

Summary Page

The summary scene gave us the opportunity to reinforce positive feedback and celebrate user achievements. After scanning or recycling an item, users see their updated stats and a motivational message—like saving enough energy to light an LED bulb. We focused on making the layout feel rewarding without overwhelming the user. The buttons to scan another item or return home were designed for seamless flow. This scene tied the experience together and made users feel that their actions had value. It was fulfilling to build something that could potentially boost eco-friendly habits.

Fig. 3.11 Summary Scene.

The Summary Scene provides immediate feedback after a user scans or sorts an item, offering a sense of achievement and reinforcing positive behavior.

Features:

  • Summary Stats: Displays key recycling metrics (Coins Earned, Items Recycled, CO₂ Saved).
  • Achievement Message: Example: "You’ve saved enough energy to light an LED bulb for 6 hours!" This adds a tangible, real-world context to the environmental impact.
  • Navigation Buttons:
    • Scan Another Item – loops the user back to the scanning scene.
    • Back to Home – returns the user to the dashboard/home page.
UI Setup:
Elements are organized under Canvas > SummaryStats and Box, with all values using TextMesh Pro for clean display.


Week 9

Third Stage of Functioning Prototype Development in Unity

Game Scene

For Task 3, we haven’t started coding the game yet; instead, we focused on exploring 3D model compatibility and customization in Unity. Our main goal was to test different file formats (such as .fbx) and successfully import them into the scene. We experimented with changing the color of the bins and applying textures or icons like the recycling symbol. This hands-on process helped us understand how materials, shaders, and mesh renderers work in Unity. Although there’s no gameplay logic yet, this stage was important for setting up the visual foundation of the game and preparing us for the next phase, which will involve scripting interactions like drag-and-drop and scoring.

Fig. 3.12 Game Scene.

The Game3D Scene is currently under construction, serving as an interactive game space where users can drag and drop waste items into the correct recycling bins. It aims to educate users on waste sorting through an engaging and visual experience.

[Current Setup]

UI Components Built:

  • Instruction Panel: Provides sorting guidance:
    • 🔵 Blue – Paper
    • 🟠 Orange – Plastic & Metal
    • 🟤 Brown – Glass
    • ⚫ Black – General Waste
  • Score System: A placeholder ScoreNum and Score counter are included for future logic (a rough sketch of that logic follows below).
  • Navigation Bar: Includes buttons for Home, Game, Bin Finder, and Settings.
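When the scoring logic is implemented, it will likely look something like this minimal sketch; the field names follow the placeholder objects mentioned above, and the increment value is arbitrary.

    csharp
    using UnityEngine;
    using TMPro;

    // Hypothetical sketch of the future score logic, wired to the ScoreNum text object.
    public class ScoreManager : MonoBehaviour
    {
        [SerializeField] private TMP_Text scoreNumText;
        private int score;

        // Called when an item is sorted into the correct bin.
        public void AddPoint()
        {
            score += 1;
            scoreNumText.text = score.ToString();
        }
    }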
Fig. 3.13 Game Scene 3D Assets.

3D Models Imported:

  • Several trash bin models have been added in .fbx format.
  • Materials and textures (like RecycleBinTexture) were applied successfully.
  • Bin colors were changed using materials (e.g., ColourBin4), with different bins assigned their designated hues.

Although the game scene is still in progress, we’ve learned a lot from preparing its 3D elements. We imported bin models in .fbx format and experimented with applying materials, changing colors, and placing textures such as the recycling icon. Getting the material layering right was initially confusing, but we managed to assign separate materials to different mesh parts. While no interactivity has been added yet, this stage helped us understand Unity’s 3D environment better and laid the foundation for drag-and-drop functionality. It’s exciting to see the gameplay space take shape visually.


Week 10

Fourth Stage of Functioning Prototype Development in Unity

3D Scanner - Polycam

Fig. 3.14 3D Scanner Apps.

During our AR exploration, Mr. Razif recommended that we use Model Targets in Vuforia Engine for better object recognition, especially for items with distinct shapes like bottles or cups. He explained that Model Targets allow for full 360° tracking and are more stable compared to image-based tracking when working with 3D objects. To generate these models, he also suggested using Polycam or other 3D scanning tools to capture real-world objects digitally.

Fig. 3.15 Scanning with Polycam.

We followed his advice and scanned several items, including a Coca-Cola bottle, a Shell Café cup, and a vitamin bottle, using Polycam. While some results were successful, others lacked clean meshes or accurate textures, especially for items with shiny or curved surfaces. This experience helped us understand the importance of model quality and scanning technique.

Fig. 3.16 Some Scanned Item Outcomes.

Some models turned out decent; others—like those with crumpled or reflective surfaces—produced less accurate results. This was especially noticeable in the Shell cup, where the mesh and texture appeared distorted or incomplete. We realized that lighting, camera angle, and object texture significantly affect the outcome. Despite these limitations, the process gave us a better understanding of photogrammetry and the importance of choosing the right object type for reliable recognition and tracking in AR environments.
AR Scan Scene (the feature I worked on)

Fig. 3.17 AR Scan Scene.

To begin the AR Scan scene, I built the foundational scene setup in Unity, starting with the UI layout under the Canvas. I created key elements such as the Scan UI and the Info Cards (Plastic, Paper, Glass), and integrated buttons like “Next” and “Back” to support scene navigation. The scene was structured inside a parent GameObject named ARScan, organizing all essential GameObjects for the camera, UI, and AR targets.

Fig. 3.18 Vuforia Engine Model Target Generator.

I installed the Vuforia Engine through Unity’s Package Manager to enable AR functionality. After integration, I used the Vuforia Model Target Generator (MTG) to upload and train 3D models such as a plastic bottle, a dropper bottle, and a glass serum bottle. These models were imported in both .obj and .fbx formats to test their tracking compatibility.

Within MTG, I created Model Targets for each material category — Plastic, Paper, and Glass — and configured a Guide View for each object to help the ARCamera recognize the model from a specific angle. Once training was complete, I imported the generated datasets into Unity to activate real-world tracking.

In Unity, I configured the ARCamera and placed the Model Targets into the scene. I also positioned the corresponding 3D models in the scene to test detection accuracy and alignment during runtime.

Fig. 3.19 Some Scanned Item Outcomes.

I found the 3D model assets on Freepik and CGTrader, then imported these .fbx files into the Unity ARScan scene. Each object has its own AR model detection script and a custom InfoCard prefab that slides in when scanned.

Fig. 3.20 Info Card and Animator.

For the animation system in the AR Scan scene, I created separate Animator Controllers for each material type (Plastic, Paper, Glass). For example, the PlasticAnimator.controller was designed with two key states: PlasticSlideIn and PlasticSlideOut. I used Unity’s Animator window to define the flow of these states, where the animation begins with PlasticSlideIn when a plastic object is detected and transitions to PlasticSlideOut when the object is no longer recognized. I organized the GameObject hierarchy so that each material’s info card has its own Animator with the correct controller attached. I linked these animations to the ModelTargetEvent.cs script, which listens for the object’s tracking status and plays the appropriate animation using SetTrigger() methods. This setup allowed the information card UI to appear smoothly with a sliding motion when an object is scanned and hide again when the object is removed from view. Each material has its own unique animator and animation clips to ensure smooth and responsive interaction.

Fig. 3.21 CardAnimationController.cs and ModelTargetEvent.cs.

For the animations, I also built a dedicated script called CardAnimationController.cs and later refactored it to include SetActive(true/false) logic, ensuring the cards appeared and disappeared smoothly based on tracking status. To connect Vuforia’s tracking feedback with UI card activation, I wrote a new script, ModelTargetEvent.cs, using an enum to detect different material types. I used ObserverBehaviour.OnTargetStatusChanged to trigger animation methods from CardAnimationController.cs accordingly. This allowed each material card (plastic, paper, or glass) to animate only when the correct model was detected. Following Mr. Razif’s advice, I tested various 3D models exported from Freepik, CGTrader, and Polycam to ensure compatibility and visibility of object features for better tracking. He also encouraged using Model Targets instead of Image Targets for stability and real object detection. Finally, I built the animation system into Unity by connecting the scripts, animators, and model target events. I tested interactions using Unity’s Play Mode and refined the scripts to make sure each card worked independently based on the recognized object.
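For readers curious how these two scripts fit together, the following is a simplified, hedged sketch of the pattern described above; it follows Vuforia’s ObserverBehaviour status callback, but the exact fields, trigger names, and methods in our project differ.

    csharp
    using UnityEngine;
    using Vuforia;

    public enum MaterialType { Plastic, Paper, Glass }

    // Hypothetical simplified version of ModelTargetEvent.cs:
    // listens to the model target's tracking status and tells the
    // card controller to slide the matching info card in or out.
    public class ModelTargetEvent : MonoBehaviour
    {
        [SerializeField] private MaterialType material;
        [SerializeField] private CardAnimationController cardController;

        private ObserverBehaviour observer;

        void Awake()
        {
            observer = GetComponent<ObserverBehaviour>();
            observer.OnTargetStatusChanged += HandleStatusChanged;
        }

        void OnDestroy()
        {
            if (observer != null)
                observer.OnTargetStatusChanged -= HandleStatusChanged;
        }

        private void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
        {
            bool tracked = status.Status == Status.TRACKED ||
                           status.Status == Status.EXTENDED_TRACKED;
            if (tracked) cardController.ShowCard(material);
            else cardController.HideCard(material);
        }
    }

    // Hypothetical simplified version of CardAnimationController.cs:
    // fires the slide-in / slide-out trigger on the matching card's Animator.
    public class CardAnimationController : MonoBehaviour
    {
        [SerializeField] private Animator plasticCard;
        [SerializeField] private Animator paperCard;
        [SerializeField] private Animator glassCard;

        public void ShowCard(MaterialType m) { GetAnimator(m).SetTrigger("SlideIn"); }
        public void HideCard(MaterialType m) { GetAnimator(m).SetTrigger("SlideOut"); }

        private Animator GetAnimator(MaterialType m) =>
            m == MaterialType.Plastic ? plasticCard :
            m == MaterialType.Paper ? paperCard : glassCard;
    }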


Week 11

Last Stage of Task 3

Asset Building / Collection

Fig. 3.22 Font.

Fig. 3.23 Scenes.

Fig. 3.24 Scripts.

Fig. 3.25 UI Elements.

Fig. 3.26 3D Assets.

Fig. 3.27 Animations.

We maintained a clean folder structure in Unity for better asset management. All 3D models were stored in the 3DModel folder, animations in Animation, UI graphics in UIelements, and logic in Scripts. This organization helped during debugging and testing, especially when building the app for iOS using Unity 6.0.

At this stage, we have not fully completed the coding for Task 3, but we’ve made foundational progress by experimenting with 3D model compatibility and Vuforia integration. We’ve established key systems for animation and scene transitions, and confirmed that our structure is scalable for future tasks such as scoring, quizzes, or data logging. Overall, this phase helped us better understand how AR object tracking works, how Unity’s Animator integrates with real-time events, and how essential clean asset management is in a multi-scene project.


Trying Build and Run to Launch the App on the iPhone

Fig. 3.28 "Render Over Native UI".

While preparing our Unity project for Build and Run, we encountered an issue that prevented the app from launching properly. With the help of Mr. Razif, we discovered that the problem was related to missing configurations in the Player settings. Specifically, we needed to enable "Render Over Native UI" under the iOS Resolution and Presentation settings. Additionally, we learned that an Event System was required in each scene to ensure that UI elements such as buttons could register input. These were small but crucial steps we had overlooked, and solving them gave us a better understanding of Unity’s build requirements and scene setup. It was a valuable debugging moment that strengthened our confidence in deploying the app correctly.

What’s In Progress / Coming Soon

Fig. 3.29 What has been done and what still needs work.

Fig. 3.30 Work Distribution.


⭐ Final Submission

1. Prototype Walkthrough Video: Google Drive / YouTube
2. Presentation Slide: Canva Slide
3. Presentation Video: Google Drive / YouTube
4. Google Drive folder: Link

GreenLens Task 3 Prototype Walkthrough Video

Fig. 3.31 GreenLens Task 3 Prototype Walkthrough Video.
  *** While watching the video, please select 1080p (HD) for the best viewing quality.

GreenLens Task 3 Presentation Recording with the Prototype Walkthrough Video

Fig. 3.32 GreenLens Task 3 Presentation Recording.
  *** While watching the video, please select 1080p (HD) for the best viewing quality.

GreenLens Task 3 Presentation Slide



4. FEEDBACK ⊹ ࣪ ˖₊˚⋆˙⟡

Fig. 4.1 Online Consultations with Mr. Razif.

Week 5

Mr. Razif mentioned that the first version of the logo needs revision — some text should be enlarged, and additional spacing is needed to avoid crowding, making the layout more spacious and user-friendly.

Week 6

Mr. Razif has approved the new brand logo. As for the visual elements on the game page, he said there's no need to overly decorate them because the design is meant to interact with real-world spaces. He also suggested adding a "Pause Game" icon and a gameplay demonstration.

Week 7

Mr. Razif said the colours we used are fine. He also told us not to stress too much about whether the code works at this stage — just focus on researching and organising what we currently have. Even if the outcome isn’t successful, it’s okay. There will be more lessons later on where we can learn and apply the knowledge to future projects.

Week 8–9

Mr. Razif advised us to focus on developing the MVP (Minimum Viable Product) flow for our AR project and bring it to a functioning prototype stage. He also suggested that we refer to our seniors' blog documentation for guidance and inspiration on how to structure our process, showcase key features, and present our development clearly.

Week 10

No feedback given.



5. REFLECTIONS /ᐠ - ˕ •マ

Developing the MVP prototype was both thrilling and challenging. It was the first time I saw my ideas come to life in a tangible form. At first, I underestimated the complexity of translating visual ideas into working flows. I had to rethink some of my designs because they didn’t function as well as I imagined once I started prototyping them.

One of the biggest lessons from this task was learning to embrace limitations and work around them creatively. For example, when a feature I initially wanted to implement wasn’t achievable with the available tools or time, I had to find an alternative that still fulfilled the user’s needs. This problem-solving process really pushed my creativity and technical adaptability.

Creating a walkthrough video for the prototype allowed me to reflect on my decisions and evaluate how intuitive my user journey really was. Seeing how all the pages and functions connected gave me a deeper appreciation for UX logic and consistency. I also became more conscious of accessibility and micro-interactions. While the prototype wasn’t perfect or fully polished, it served its purpose by allowing me to test the concept, learn through iteration, and gain confidence in applying technical tools.
