In Week 1, Mr. Razif briefed us on the instructions and requirements for each task. He introduced the concept of experiential design, explaining its importance and application. He also encouraged us to use ChatGPT to help brainstorm proposal ideas, particularly focusing on Augmented Reality (AR). To deepen our understanding, we were shown examples of past students' work, which served both as a reference and a source of inspiration for our own projects.
Week 2
In Week 2, we learned about the different terms and disciplines related to user experience. UX (User Experience) focuses on how users interact with products at specific touchpoints, while XD (Experience Design) takes a broader view of the entire journey, including environment and context. We also explored related fields such as UI (User Interface), CX (Customer Experience), IxD (Interaction Design), UXE (User Experience Ecosystem), and IA (Information Architecture), each playing a key role in shaping how users engage with digital and physical services.
Mr. Razif introduced us to various UX mapping tools to help us understand and design better experiences. We started with the Empathy Map, which helps identify what users say, think, do, and feel—this helps teams align and make better design decisions. The Customer Journey Map visualizes a user’s full experience across phases and emotions, and we learned that it can be paired with storyboarding and used to propose future improvements. We also looked at the Experience Map, which gives a high-level overview of user emotions and behavior across multiple phases. Finally, the Service Blueprint breaks down both frontstage (what users see) and backstage (supporting actions) components, giving a complete picture of how a service is delivered. We also had Class Activity 1.
Week 3
During the lecture, we explored the distinctions between AR (Augmented Reality) and MR (Mixed Reality) by engaging in an activity using Google’s AR feature on a smartphone, where we viewed life-sized 3D animals superimposed onto real-world surroundings. This experience exemplified AR, as the virtual objects were overlaid onto the real environment without interactive spatial awareness, unlike MR, which involves dynamic interaction with physical objects. This exercise clarified that AR focuses on visual augmentation, while MR integrates virtual elements more interactively within the physical space. After that, we also had Class Activity 2. AR Design Workshop 1 introduced the fundamentals of creating augmented reality applications using Unity and Vuforia. It covered setting up a project, configuring AR components, and integrating image targets to trigger 3D objects in a real-world environment. Participants gained hands-on experience in building and testing AR content, enhancing their understanding of how digital assets can interact with physical surroundings to create immersive user experiences.
Week 4
During the lecture, we continued building on the AR experience by implementing AR Design Workshop 2: screen navigation and cube animation using Unity and Vuforia. We started by adding a Canvas with Hide and Show buttons, allowing users to control the visibility of AR content through simple screen navigation. Subsequently, we enhanced interactivity by animating the 3D cube using the Animation tab, creating movements like rotation, scaling, and position shifts. The animation was triggered through buttons and controlled via a C# script, demonstrating how user inputs can dynamically manipulate AR objects, creating a more engaging and interactive experience.
2. INSTRUCTIONS ⊹ ࣪ ˖₊˚⋆˙⟡
Fig. 2.1 Module Information Booklet - Experiential Design.
Task 1: Trending Experience (Week 01–04, 20%):
Explore current design/tech trends.
Identify platform features and limitations.
Submit weekly exercises + reflective blog post.
3. TASK 1: TRENDING EXPERIENCE ⊹ ࣪ ˖₊˚⋆˙⟡
A. Research & Exploration
Reflecting on Trending AR Experiences
Fig. 3.1 IKEA Place app.
📱💬 Research & Exploration of IKEA Place app
Exploring current AR trends provided valuable insights into how augmented reality is being utilized to transform user experiences across various industries. One of the most inspiring examples is the IKEA Place app, which allows users to virtually place furniture in their own space using AR. This application not only enhances the shopping experience but also addresses a practical need – helping users visualize how furniture will look and fit before purchasing. The realistic 3D models and scaling make it easier for users to make informed buying decisions, bridging the gap between digital and physical retail.
Fig. 3.2 Pokémon GO.
📱💬 Research & Exploration of Pokémon GO
Another captivating AR experience I explored is Pokémon GO, which successfully gamified AR by blending virtual characters with real-world locations. Players can explore their surroundings to find, capture, and battle Pokémon, creating an engaging outdoor adventure. The app’s use of geolocation and AR made it a global phenomenon, demonstrating how AR can encourage physical activity and social interaction through immersive gameplay.
Fig. 3.3 Snapchat AR Lenses.
📱💬 Research & Exploration of Snapchat AR Lenses
Additionally, the Snapchat AR Lenses showcase how AR can be used for entertainment and brand engagement. Brands like Prada and Nike have leveraged Snapchat’s AR lenses to let users virtually try on products, promoting new collections and increasing user interaction. This trend illustrates how AR can drive marketing campaigns while providing users with a fun, interactive shopping experience.
Summary:
These examples emphasize the growing potential of AR in enhancing user experiences, making information more accessible, engaging, and memorable. The common theme across all these applications is how effectively AR merges the physical and digital worlds, creating a more interactive and user-centric environment. As AR technology continues to evolve, its applications in retail, education, gaming, and healthcare will likely expand, offering even more impactful experiences.
B. Weekly Reflections on Class Activities and Exercises
Week 1
Mr. Razif mentioned that we could choose to work individually or in pairs for the project proposal. I decided to team up with Tan Sin Mi, and together we began brainstorming and discussing six different proposal ideas. We organized our ideas in a Google Doc and planned to consult with Mr. Razif in the following week for feedback and further refinement.
Week 2
Class Activity 1 - User Journey Map
Mr. Razif assigned us a group activity to construct a theme park journey map. This task was part of our ongoing experiential design exploration. Our group (Group 2) consists of the following members: Yan Zhi Xuan (0369425) – me, Natalie Chu Jing Xuan (0354589), Chan Xiang Lam (0358400), Tan Sin Mi (0368821), Hanson Pea Wei Hao (0359463) and Sin Jun Ming (0364638).
We decided to focus on Sunway Lagoon as our selected theme park due to its wide range of attractions and diverse user touchpoints, which provided meaningful areas for UX improvement.
🗺️ The journey map was collaboratively constructed on Miro, where we identified:
Gain points (positive experiences),
Pain points (frustrations or challenges),
And solutions — including the integration of Augmented Reality (AR) to enhance visitor engagement and flow.
We organized each touchpoint from arrival (parking, ticketing) to final experiences (gift shop), ensuring a comprehensive analysis of the park experience. After presenting, Mr. Razif reviewed our Sunway Lagoon journey map and suggested that we incorporate more Augmented Reality (AR) experiences into the touchpoints. He encouraged us to think beyond basic solutions and explore how AR could enhance navigation, queuing, interaction, and overall engagement throughout the theme park journey. This feedback helped us rethink our approach and integrate more innovative, tech-driven ideas into our updated journey map. The following is the Miro board of Sunway Lagoon's initial journey map, updated journey map and the future journey map:
Fig. 3.4 Miro Board of Sunway Lagoon Journey Map.
💜Self-Reflection - Journey Map
After working on the Sunway Lagoon journey map, I realised it was an eye-opening experience in understanding the flow of a user’s journey through a theme park environment. By analyzing each touchpoint from parking to the gift shop, I gained a clearer perspective on how user interactions and emotions fluctuate throughout the visit. Mr. Razif’s feedback to integrate more AR solutions challenged us to think beyond basic fixes and consider how AR could elevate the experience through interactive maps, virtual queuing, and immersive content. Collaborating with my group members also fostered a deeper exchange of ideas, allowing us to identify diverse pain points and propose comprehensive solutions. It emphasized the importance of aligning user needs with technological enhancements to create a more engaging and seamless experience. Overall, this activity not only improved my skills in user journey mapping but also reinforced the value of user-centric design thinking in crafting memorable and impactful experiences.
Week 3
Fig. 3.5 Task 1 & Activity 1.
Task 1: Identify XR Experience - AR or MR
Objective: The objective of this task is to understand the difference between Augmented Reality (AR) and Mixed Reality (MR) by identifying which type of XR experience is demonstrated in a given example.
Activity 1: Viewing 3D Animals via AR
Objective: To practically experience AR using Google’s AR feature on a smartphone.
💜Self-Reflection - Task 1 & Activity 1
During the lecture and activity, I learned the fundamental differences between AR and MR by experiencing Google’s AR feature firsthand. Initially, I assumed the interactive nature of the tiger model might indicate MR, but I realized that true MR requires spatial awareness and interaction with the environment, which was not present in this case. Through this activity, I better understood that AR primarily focuses on overlaying digital content onto the real world, while MR goes a step further by integrating virtual elements interactively within the physical environment. This realization helps in distinguishing applications where simple visualization suffices versus those that require real-world interaction. I appreciated the hands-on experience as it bridged theoretical concepts with real-world applications, reinforcing the practical understanding of AR technology.
Class Activity 2 - AR Experience (Scenario-Based Design)
Fig. 3.6 Task 2 & Scenario Example by Mr. Razif.
Task 2: AR Experience and Extended Visualization in Real-Life Scenarios
Objective: To conceptualize an AR experience within a specific environment (Kitchen, Shopping Mall, or Gym) and determine how extended visualization can enhance user interaction and experience.
Mr. Razif showed us an example of the kitchen scenario AR experience and assigned us a group activity to construct a scenario AR Experience slideshow with explanations. Our group (Group 2) consisted of the same members as in Class Activity 1. We decided to work on the Gym Scenario AR Experience.
The following is our Gym Scenario AR Experience slideshow:
In this task, our group developed the GymGuide AR concept, focusing on assisting beginners like Alex in navigating the gym environment confidently. Through this experience, I gained deeper insights into how AR technology can provide real-time guidance and visual feedback, particularly in contexts where users may feel self-conscious or uncertain, such as the gym. The process of designing the AR system emphasized the importance of personalized support and intuitive visual cues, ensuring that users receive effective, hands-free guidance without needing to ask for help. Additionally, conceptualizing features like the Rep & Set Tracker and Virtual Personal Trainer allowed me to explore how AR can reduce anxiety, enhance workout accuracy, and build user confidence. This project underscored how experiential design can transform everyday activities into more engaging and supportive interactions, effectively bridging the gap between technology and physical activity.
AR Design Workshop 1 - AR Experience Development (Unity & Vuforia)
Fig. 3.8 AR Design Workshop 1.
In this session, we learned how to create a simple AR experience using Unity and Vuforia. The process involved several key steps:
1. Setting Up the Project in Unity: We began by creating a new project in Unity using the Universal 3D template, as it supports AR development with the Universal Render Pipeline (URP).
2. Adding Vuforia SDK: We imported the Vuforia SDK to enable AR functionalities. This SDK helps in recognizing image targets and rendering 3D objects over them.
3. Creating the Scene: The Unity scene was set up with essential components:
ARCamera: Detects the image target and aligns 3D content with it.
Image Target: The specific visual marker that triggers the AR content.
3D Object: A cube was used as a simple 3D model to display upon target recognition.
4. Vuforia Target Database: We uploaded an image target to Vuforia Developer Portal, created a target database, and downloaded the database to Unity. My target was named Spiderwoman, rated with 5 stars for tracking quality.
5. Configuring the AR Experience: The cube was positioned over the image target in Unity, allowing it to appear when the target is detected. Testing was done using the Unity editor to simulate how the AR object responds to the target image.
6. Building and Testing: The AR project was tested on a mobile device to ensure that the 3D object aligns correctly with the image target.
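The workshop relied on Vuforia's built-in components rather than custom code, but the target-detection step above can be sketched in a small script. This is a hypothetical example (the class name and log messages are mine, not from the tutorial), assuming the Vuforia 10+ API where each image target carries an ObserverBehaviour:

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical helper attached to the Image Target GameObject.
// Logs when the target (e.g. "Spiderwoman") is found or lost,
// which is the same event Vuforia uses to show or hide the cube.
public class TargetStatusLogger : MonoBehaviour
{
    void Start()
    {
        var observer = GetComponent<ObserverBehaviour>();
        if (observer != null)
            observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // TRACKED and EXTENDED_TRACKED both count as "visible".
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        Debug.Log(behaviour.TargetName + (tracked ? " found" : " lost"));
    }
}
```

In practice, Vuforia's default event handler does this wiring for you; a custom script like this is only needed when you want extra behaviour on detection.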
💜 Self-Reflection - AR Development
This AR development session was a valuable hands-on experience in integrating Unity and Vuforia for augmented reality projects. Setting up the AR scene and configuring the image target provided a clear understanding of how AR content is triggered and displayed. Initially, the process felt complex, especially in managing the Vuforia database and ensuring that the Unity setup matched the Vuforia configuration. However, once the components were properly aligned, seeing the AR content successfully rendered on the mobile device was rewarding. Collaborating with classmates and referring to the tutorial video helped clarify uncertainties and provided a better grasp of the technical flow. This exercise also reinforced the importance of precise asset placement, as minor misalignments can disrupt the AR experience. Overall, this project enhanced my confidence in building AR experiences and laid a strong foundation for more complex applications in future projects.
Week 4
AR Design Workshop 2 - AR Experience Screen Navigation & Cube Animation
Fig. 3.9 AR Design Workshop 2.
Fig. 3.10 AR Design Workshop 2 Outcome Video (YouTube).
In this session, we continued from the previous tutorial, focusing on screen navigation within the AR experience. The objective was to implement interactive buttons that allow users to navigate between screens within the AR project. We maintained the same Spiderwoman target image for consistency.
1. Setting Up the Scene: We started by creating a new scene in Unity, selecting the Basic (Built-in) template to maintain a simple interface.
2. Canvas and UI Setup: A Canvas was added to the scene to hold the user interface elements. Inside the Canvas, we created a Panel to act as the background of the screen. Two buttons, Hide and Show, were added to the Canvas, each with assigned functionalities to control the visibility of AR content.
3. Button Functionality: We implemented the button functionality using Unity’s UI Button component. The Hide button makes the AR content invisible, while the Show button makes it reappear. This interaction provides a simple screen navigation system, allowing the user to control what appears on the screen.
4. Script Implementation: We used the SetActive() method in a script to toggle the visibility of the cube, linking each button to its respective function. The script controlled the panel’s visibility based on the button press, demonstrating how user inputs can be integrated into the AR experience.
5. Testing and Debugging: The project was tested to ensure that the buttons responded correctly to user inputs and that the AR target image continued to function as intended.
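The Hide/Show behaviour described in the steps above can be captured in a very small script. This is a minimal sketch with illustrative names (the class and method names are my own, not from the tutorial): the cube is assigned in the Inspector, and each method is wired to a button's OnClick event.

```csharp
using UnityEngine;

// Minimal sketch of the Hide/Show screen navigation.
// Attach to any scene object; drag the AR cube into the "cube" slot,
// then wire ShowCube/HideCube to the Show/Hide buttons' OnClick events.
public class ContentToggle : MonoBehaviour
{
    public GameObject cube;   // the AR object to show or hide

    public void ShowCube() { cube.SetActive(true); }
    public void HideCube() { cube.SetActive(false); }
}
```

Because SetActive() disables the whole GameObject, hiding the cube also stops any animation or scripts on it, which is usually the desired behaviour for simple navigation.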
💜Self-Reflection: Screen Navigation
This session was particularly insightful in understanding how UI elements and AR content can work together to create interactive experiences. Setting up the buttons and configuring their functions provided valuable hands-on practice in connecting user interface components with AR elements in Unity. Initially, the challenge was ensuring that the buttons were properly linked to the C# script and that the interactions were smooth and responsive. Testing the project revealed how slight adjustments in the Canvas settings could impact the user experience, emphasizing the importance of precise alignment and scaling. Overall, this exercise reinforced the concept of screen navigation in AR, demonstrating how simple interactions like Hide and Show can significantly enhance user control and engagement in augmented reality applications.
Following the screen navigation tutorial, we proceeded to implement animation for the 3D Cube within the AR experience. This step aimed to enhance interactivity by making the cube respond to user inputs.
1. Scene Setup: We continued using the same AR scene with the Spiderwoman target image as the AR marker. The existing cube was retained as the primary object to animate.
2. Creating the Animation: The Animation tab was opened in Unity, and a new animation clip was created named CubeAnimation. The animation was set to include simple transformations such as rotation, scaling, and position movement to make the cube visually engaging. Keyframes were added to define specific movements and transitions over time.
3. Animation Controller: An Animator Controller was created and assigned to the cube, linking the CubeAnimation to the controller. The Animator window allowed us to control the animation’s flow, adjusting the speed and transition settings.
4. Triggering the Animation: An animation trigger was configured so that the animation plays when the AR target is detected. Additionally, buttons were implemented to play and stop the animation.
5. Testing and Debugging: The project was tested to verify that the cube animation played correctly upon AR target detection.
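The button-driven play/stop control from the steps above can be sketched as a short Animator script. This is an illustrative example, assuming the cube's Animator holds a state named "CubeAnimation" (the class and method names are mine):

```csharp
using UnityEngine;

// Sketch of button-driven animation control for the AR cube.
// Assumes the cube's Animator Controller has a state called "CubeAnimation".
public class CubeAnimationController : MonoBehaviour
{
    public Animator cubeAnimator;  // assigned in the Inspector

    // Wired to the Play button: restart the clip from the beginning.
    public void PlayAnimation()
    {
        cubeAnimator.speed = 1f;
        cubeAnimator.Play("CubeAnimation", 0, 0f);
    }

    // Wired to the Stop button: freeze the animation by zeroing its speed.
    public void StopAnimation()
    {
        cubeAnimator.speed = 0f;
    }
}
```

Pausing via Animator.speed (rather than disabling the Animator) keeps the cube frozen at its current pose, so pressing Play resumes cleanly.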
💜Self-Reflection: Cube Animation
Adding animation to the cube was an engaging and visually impactful step in our AR project. It allowed us to explore how motion can enhance user interaction and make AR content more dynamic. Initially, creating smooth transitions and controlling the animation flow was challenging, as minor timing adjustments significantly affected the outcome. Integrating user controls through buttons further emphasized how user inputs can manipulate AR objects, creating a more interactive and immersive experience. Overall, this session underscored the importance of animation in conveying actions and attracting user attention, setting the foundation for more advanced interactions in future AR projects.
⭐ Final Submission
C. Proposal of 3 AR Project Ideas
3 AR Project Ideas
My partner is Tan Sin Mi, and each of us worked on 3 project ideas. For each idea, we included:
1. Introduction
2. Ideation
3. Target Audience
4. Problem Statement
5. Proposed Solution
6. Goals
7. Visual Mockup & Sketch
The following are my 3 AR Project Ideas:
AR Recycling Sorter
AR Emotion Tracker & Reflection Journal
AR Pet Trainer & Companion
Fig. 3.11 3 AR Project Ideas.
💜Self-Reflection - Proposal
The three AR project ideas focus on enhancing everyday experiences through interactive and educational augmented reality solutions. The AR Recycling Sorter addresses waste sorting confusion by allowing users to scan household items and receive instant visual feedback on the correct recycling bin, promoting sustainable habits. The AR Emotion Tracker & Reflection Journal provides a calming space for emotional reflection, transforming emotions into visual metaphors like floating orbs and offering reflective prompts through AR overlays. Lastly, the AR Pet Trainer & Companion supports new pet owners in basic training routines by projecting a virtual pet guide that demonstrates commands like "sit" or "stay," fostering interactive learning and bonding through step-by-step guidance.
4. FEEDBACK ⊹ ࣪ ˖₊˚⋆˙⟡
Week 1
No feedback given.
Week 2
For our group activity, Mr. Razif reviewed our Sunway Lagoon journey map and suggested that we incorporate more Augmented Reality (AR) experiences into the touchpoints. He encouraged us to think beyond basic solutions and explore how AR could enhance navigation, queuing, interaction, and overall engagement throughout the theme park journey.
Fig. 4.1 Consultation with Mr. Razif.
After consulting with Mr. Razif, he provided positive feedback on my first idea, the AR Recycling Sorter, noting that it is both interesting and meaningful. He suggested that I further develop the concept by adding more detailed features to enhance its functionality and user engagement. Additionally, he mentioned the option to either combine my AR experience with Sin Mi’s idea for a more comprehensive project or to continue refining the AR Recycling Sorter independently. This feedback has given me a clearer direction on how to proceed and the potential to expand the concept further.
Week 3
After discussing with Mr. Razif, he approved our target audience, allowing us to proceed with developing the user personas based on either imagined users or actual interview data. He also emphasized the need to add more detailed features to the document, ensuring a comprehensive breakdown of the AR Recycling Sorter’s functionalities. Additionally, for the user journey map, he suggested we include proposed solutions to address identified pain points, aligning them with the AR experience to provide a more complete and impactful user flow.
Week 4
Mr. Razif provided constructive feedback, noting that our key features are on the right track, and we can proceed as planned. For the user persona, we need to identify the primary and secondary target audiences, focusing on their specific needs and pain points. The empathy map and user journey map were also approved, with no major adjustments required. Regarding the Canva slides, he suggested increasing font size and prioritizing main points, removing unnecessary content to maintain clarity. With these adjustments, we are now ready to proceed with the mockup development, bringing the AR Recycling Sorter to life visually.
5. REFLECTIONS
/ᐠ - ˕ •マ
At the beginning of this module, I was excited but also slightly nervous because I wasn’t sure what kind of “trend” I should explore. There were so many emerging technologies and design platforms that I hadn’t used before. I started this task by immersing myself in market research—scrolling through design inspiration platforms, reading blogs, and checking out what kinds of apps or tools were currently popular in different industries like health, education, and lifestyle.
Through the weekly exercises, I learned not just to follow trends blindly, but to understand the “why” behind their success. I explored how user behavior shapes design direction and how technologies evolve to meet those changing needs. For example, I looked into micro-interactions and AR integrations in mobile apps, noticing how they enhance engagement and emotional connection.
This task also challenged me to critically examine the limitations of certain platforms or trends. For instance, some trends looked visually impressive but had poor usability or accessibility. That pushed me to think beyond surface-level design. The weekly reflections I posted on my E-portfolio helped me track my thinking and growth clearly. I realized how important it is for a designer to be adaptable and informed—what’s popular today might be obsolete tomorrow. This task set the tone for the entire module by laying a solid foundation in design awareness, research habits, and trend evaluation.