Experiential Design / Task 3

26.05.2025 - 30.07.2025 (Week 6 - Week 10)

Tan Sin Mi (0368821)

Experiential Design

Task 3


CONTENT

INSTRUCTIONS



EXERCISES



TASK 3 - PROJECT MVP PROTOTYPE

After completing the final wireframes and mockups, we were able to start coding the app in Unity. We decided to build the app based on the iPhone 14 Pro Max screen size.

First Stage of Functioning Prototype Development in Unity


Loading Page:

We began our app development journey by designing a simple yet effective loading scene. It served as the first touchpoint between the user and the app, displaying the GreenLens logo and setting the tone for the experience ahead. We wanted it to feel quick and purposeful, not just a delay screen. Technically, this part was straightforward, but we encountered a small font rendering issue with Unicode characters in the Poppins-Medium SDF, which we resolved using fallback characters. Overall, it gave us a solid start and helped establish a clean design standard from the beginning.

Figure 1.1 Loading Page

Figure 1.2 AutoSceneLoader.cs

The Loading Page is the introductory splash screen for the GreenLens application. It provides an initial branding experience before users are transitioned into the interactive sections of the app.

Purpose: To display the app logo and tagline briefly while backend assets initialize. Acts as a visual buffer to ensure a smooth transition to the Launching or Home scene.

UI Elements:
  • Logo: A large, centered GreenLens logo with a magnifier icon representing recycling assistance.
  • Tagline: “Your Recycling Tutor Assistant” reinforces the app’s function and value.
  • Background: Clean green tone, symbolizing sustainability and eco-friendliness.
  • Canvas Setup: Uses Unity’s UI Canvas system and is layered above the camera and directional light.
Scene Transition: This page is loaded at app start and will call the SceneLoader.cs script to transition to the “Launching” scene after a short delay or animation sequence.
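Since the full AutoSceneLoader.cs isn't reproduced here, below is a minimal sketch of this kind of delayed transition; the scene name and delay value are placeholder assumptions.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch of a delayed splash-to-scene transition, similar in
// spirit to our AutoSceneLoader.cs. Scene name and delay are assumed.
public class AutoSceneLoader : MonoBehaviour
{
    [SerializeField] private string nextScene = "Launching"; // must be added to Build Settings
    [SerializeField] private float delaySeconds = 2f;        // splash duration

    private void Start()
    {
        StartCoroutine(LoadAfterDelay());
    }

    private IEnumerator LoadAfterDelay()
    {
        // Keep the splash visible briefly, then switch scenes.
        yield return new WaitForSeconds(delaySeconds);
        SceneManager.LoadScene(nextScene);
    }
}
```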


Launching Page:

Next, we built the launching scene, which functions as an onboarding tutorial for users. We used Illustrator to design clean vector assets and exported them as SVGs to keep the files lightweight. The challenge was ensuring the visuals were simple enough to understand while still being engaging. This scene helped us bridge design and functionality, setting users up with the confidence to start scanning and sorting waste using AR.


Fig. 3.6 Launching Scene and SceneManager.cs

The Launching Scene acts as the onboarding screen of our AR Recycling Sorter app. Built in Unity, this scene introduces users to the purpose of the app with a clean and friendly visual style.

Scene Functionality:
  • Purpose: Briefly guides users to use their camera to scan items and receive AR-based sorting instructions.
  • Message: “Use your camera to scan any item and get instant AR guidance on the correct recycling bin to use.”
  • Interaction: Includes a “Next” button that transitions users to the next tutorial or directly into the Home Page/Scan mode.

UI Elements:
  • Panel > Background: Holds the main layout and colour background.
  • Icon A–C: Animated or static illustrations representing the scanning process (recyclable bin with a magnifying glass).
  • NextBtn: Interactive button component styled with TextMesh Pro, connected to navigation logic (see the sketch after this list).
  • Title & Body Text: Centre-aligned onboarding message styled with SDF fonts for crisp display.
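Our actual navigation script isn't shown above, but a minimal sketch of how the NextBtn could be wired up looks like this; the target scene name is an assumption, and the same method can equally be assigned through the Button's OnClick() list in the Inspector.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.UI;

// Sketch of the NextBtn hookup for the Launching scene.
public class NextButtonNav : MonoBehaviour
{
    [SerializeField] private Button nextBtn;              // assigned in the Inspector
    [SerializeField] private string targetScene = "Home"; // assumed scene name

    private void Awake()
    {
        // Load the next scene whenever the button is pressed.
        nextBtn.onClick.AddListener(() => SceneManager.LoadScene(targetScene));
    }
}
```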



Fig. 3.7 Other Scripts for Launching Scene.

Fig. 3.8 Sketchfab / Unity plugin.

I found that importing 3D models from Sketchfab directly through the Unity plugin is a fast and efficient way to enhance AR experiences, especially when I'm prototyping or adding quick visual assets. It automatically brings in the model with textures and materials, saving me a lot of setup time. However, when I tried importing manually using FBX, I realized I had to manually assign the materials and textures, which was quite tedious. I learned that using glTF format is a better option because it preserves the textures and hierarchy more accurately in Unity. This process taught me how important it is to choose the right file format when working with 3D assets in AR projects.

Second Stage of Functioning Prototype Development in Unity


Home Page:

During Week 8, designing the home scene allowed us to experiment with user personalization and real-time feedback. We incorporated welcoming text, user stats like coins earned and CO₂ saved, and a visually engaging interface. One of our priorities was creating a layout that feels both informative and rewarding. We encountered some difficulty with spacing and alignment across devices, but we adjusted the canvas settings to handle different screen sizes. Integrating the navigation bar made the scene feel more complete. This scene became our central hub and helped us understand how to balance design with dynamic content needs.

Fig. 3.9 Home Scene.

The Home Page serves as the main dashboard of the GreenLens AR app. It welcomes users and provides quick insights into their recycling impact, as well as easy access to other core features.

Header Section:

  • User Greeting: Personalized with the user’s name (e.g., “Hi, Yan”).
  • Avatar/Character Icon: Adds a fun, gamified feel to the page.

Score Stats Panel:

  • Coins Earned: Tracks gamified rewards earned through recycling.
  • CO₂ Saved: Displays how much carbon dioxide (CO₂) has been reduced by recycling.
  • Scan Activity Icon: A visual indicator tied to scan usage or progress.

Info Section:

  • Text reads: “How much carbon dioxide (CO₂) emission you can avoid from recycling different materials?”
  • Tapping the info button opens the Study Scene (as seen in the OnClick() event in the Button component).

Scan Button:

  • Direct call-to-action: “Scan Item” button for initiating the item recognition feature.

Bottom Navigation Bar: Home | Game | Scan | Bin Finder | Settings

Study Page:

We developed the study scene to support user education by explaining the environmental impact of recycling. Presenting scientific data in a digestible way was important to us, so we created a table showing average CO₂ savings for different materials and included a simple formula users could understand. It was a rewarding challenge to simplify technical information while maintaining accuracy. By connecting learning to in-app actions, we hope to enhance user motivation and give deeper meaning to their recycling efforts. This scene turned out to be a strong informational anchor for the app.

Fig. 3.10 Study Scene.

The Study Scene is an educational page designed to help users understand the science behind recycling and how it translates to carbon savings.

Features:

  • Material Breakdown Table: Shows average CO₂ savings per kg of recycled material:
    • Aluminum: ~10.9 kg CO₂ saved
    • Plastic: ~3.5 kg CO₂ saved
    • Paper: ~4.3 kg CO₂ saved
    • Glass: ~0.3 kg CO₂ saved
    • Steel: ~1.2 kg CO₂ saved
  • Formula Section: CO₂ Saved = Weight of Material × CO₂ Factor (includes a user-friendly example for better comprehension; see the sketch after this list).
  • Navigation Bar: Includes access to other parts of the app (Home, Scan, etc.) for seamless movement between learning and action.
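To make the formula concrete, here is a small illustrative sketch of the calculation in C#, using the factors from the table above; the class and method names are our own for illustration.

```csharp
using System.Collections.Generic;

// Illustrative sketch of the Study Scene formula:
// CO2 saved (kg) = weight of material (kg) x CO2 factor (kg CO2 per kg).
public static class Co2Calculator
{
    // Average savings per kg recycled, taken from the table above.
    private static readonly Dictionary<string, float> Factors =
        new Dictionary<string, float>
        {
            { "Aluminum", 10.9f },
            { "Plastic",   3.5f },
            { "Paper",     4.3f },
            { "Glass",     0.3f },
            { "Steel",     1.2f },
        };

    public static float Co2Saved(string material, float weightKg)
    {
        return Factors.TryGetValue(material, out var factor) ? weightKg * factor : 0f;
    }
}

// Example: recycling 0.5 kg of aluminum saves 0.5 × 10.9 = 5.45 kg of CO₂.
```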

Summary Page:

The summary scene gave us the opportunity to reinforce positive feedback and celebrate user achievements. After scanning or recycling an item, users see their updated stats and a motivational message, like saving enough energy to light an LED bulb. We focused on making the layout feel rewarding without overwhelming the user. The buttons to scan another item or return home were designed for seamless flow. This scene tied the experience together and made users feel that their actions had value. It was fulfilling to build something that could potentially boost eco-friendly habits.

Fig. 3.11 Summary Scene.

The Summary Scene provides immediate feedback after a user scans or sorts an item, offering a sense of achievement and reinforcing positive behavior.

Features:

  • Summary Stats: Displays key recycling metrics (Coins Earned, Items Recycled, CO₂ Saved).
  • Achievement Message: Example: "You’ve saved enough energy to light an LED bulb for 6 hours!" This adds a tangible, real-world context to the environmental impact.
  • Navigation Buttons:
    • Scan Another Item – loops the user back to the scanning scene.
    • Back to Home – returns the user to the dashboard/home page.

UI Setup: Elements are organised under Canvas > SummaryStats and Box, with all values using TextMesh Pro for clean display.
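The stats wiring itself isn't shown in the figure above, but a minimal sketch of pushing values into the TextMesh Pro labels might look like this; the field and method names are illustrative.

```csharp
using TMPro;
using UnityEngine;

// Sketch of a view script for the SummaryStats panel.
public class SummaryStatsView : MonoBehaviour
{
    [SerializeField] private TMP_Text coinsText; // Coins Earned label
    [SerializeField] private TMP_Text itemsText; // Items Recycled label
    [SerializeField] private TMP_Text co2Text;   // CO₂ Saved label

    public void Show(int coins, int itemsRecycled, float co2SavedKg)
    {
        coinsText.text = coins.ToString();
        itemsText.text = itemsRecycled.ToString();
        co2Text.text = $"{co2SavedKg:0.0} kg CO₂ saved";
    }
}
```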

Third Stage of Functioning Prototype Development in Unity

Game Scene:

For Task 3, we haven’t started coding the gameplay yet; instead, we focused on exploring 3D model compatibility and customization in Unity. Our main goal was to test different file formats (such as .fbx) and successfully import them into the scene. We experimented with changing the color of the bins and applying textures or icons like the recycling symbol. This hands-on process helped us understand how materials, shaders, and mesh renderers work in Unity. Although there is no gameplay logic yet, this stage was important for setting up the visual foundation of the game and preparing us for the next phase, which will involve scripting interactions like drag-and-drop and scoring.
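As a pointer for that phase, here is a small sketch of the kind of runtime recoloring we experimented with; in practice we mostly created materials (e.g., ColourBin4) in the editor and assigned them by hand, so this script is illustrative rather than what we shipped.

```csharp
using UnityEngine;

// Sketch: tint a bin's mesh at runtime instead of swapping materials.
public class BinColorizer : MonoBehaviour
{
    [SerializeField] private Color binColor = new Color(0.2f, 0.5f, 1f); // e.g. blue for paper

    private void Start()
    {
        // Accessing .material instantiates a per-object copy, so other
        // bins sharing the same source material keep their own colour.
        GetComponent<MeshRenderer>().material.color = binColor;
    }
}
```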

Fig. 3.12 Game Scene.

The Game3D Scene is currently under construction, serving as an interactive game space where users can drag and drop waste items into the correct recycling bins. It aims to educate users on waste sorting through an engaging and visual experience.

[Current Setup]

UI Components Built:

  • Instruction Panel: Provides sorting guidance:
    • 🔵 Blue – Paper
    • 🟠 Orange – Plastic & Metal
    • 🟤 Brown – Glass
    • ⚫ Black – General Waste
  • Score System: A placeholder ScoreNum and Score counter are included for future logic.
  • Navigation Bar: Includes buttons for Home, Game, Bin Finder, and Settings.

Fig. 3.13 Game Scene 3D Assets.

3D Models Imported:

  • Several trash bin models have been added in .fbx format.
  • Materials and textures (like RecycleBinTexture) were applied successfully.
  • Bin colors were changed using materials (e.g., ColourBin4), with different bins assigned their designated hues.

Although the game scene is still in progress, we’ve learned a lot from preparing its 3D elements. We imported bin models in .fbx format and experimented with applying materials, changing colors, and placing textures such as the recycling icon. Getting the material layering right was initially confusing, but we managed to assign separate materials to different mesh parts. While no interactivity has been added yet, this stage helped us understand Unity’s 3D environment better and laid the foundation for drag-and-drop functionality. It’s exciting to see the gameplay space take shape visually.

Fourth Stage of Functioning Prototype Development in Unity

3D Scanner - Polycam:

Fig. 3.14 3D Scanner Apps.

During our AR exploration, Mr. Razif recommended that we use Model Targets in Vuforia Engine for better object recognition, especially for items with distinct shapes like bottles or cups. He explained that Model Targets allow for full 360° tracking and are more stable than image-based tracking when working with 3D objects. To generate these models, he also suggested using Polycam or other 3D scanning tools to capture real-world objects digitally.

Fig. 3.15 Scanning with Polycam.

We followed his advice and scanned several items, including a Coca-Cola bottle, a Shell Café cup, and a vitamin bottle, using Polycam. While some results were successful, others lacked clean meshes or accurate textures, especially for items with shiny or curved surfaces. This experience helped us understand the importance of model quality and scanning technique.

Fig. 3.16 Some Scanned Item Outcomes.

Some models turned out decent, while others, such as those with crumpled or reflective surfaces, produced less accurate results. This was especially noticeable in the Shell cup, where the mesh and texture appeared distorted or incomplete. We realized that lighting, camera angle, and object texture significantly affect the outcome. Despite these limitations, the process gave us a better understanding of photogrammetry and the importance of choosing the right object type for reliable recognition and tracking in AR environments.

AR Scan Scene (the feature I worked on):

Fig. 3.17 AR Scan Scene.

To begin the AR Scan scene, I built the foundational scene setup in Unity, starting with the UI layout under the Canvas. I created key elements such as the Scan UI and Info Cards (Plastic, Paper, Glass), and integrated buttons like “Next” and “Back” to support scene navigation. The scene was structured inside a parent GameObject named ARScan, organizing all essential GameObjects for the camera, UI, and AR targets.

Fig. 3.18 Vuforia Engine Model Target Generator.

I installed the Vuforia Engine through Unity’s Package Manager to enable AR functionality. After integration, I used the Vuforia Model Target Generator (MTG) to upload and train 3D models such as a plastic bottle, a dropper bottle, and a glass serum bottle. These models were imported in both .obj and .fbx formats to test their tracking compatibility.

Within MTG, I created Model Targets for each material category (Plastic, Paper, and Glass) and configured a Guide View for each object to help the ARCamera recognize the model from a specific angle. Once training was complete, I imported the generated datasets into Unity to activate real-world tracking.

In Unity, I configured the ARCamera and placed the Model Targets into the scene. I also positioned the corresponding 3D models in the scene to test detection accuracy and alignment during runtime.

Fig. 3.19 Some Scanned Item Outcomes.

I found the 3D model assets on Freepik and CGTrader, then imported the .fbx files into the Unity ARScan scene. Each object has its own AR model detection script and a custom InfoCard prefab that slides in when scanned.

Fig. 3.20 Info Card and Animator.

For the animation system in the AR Scan scene, I created separate Animator Controllers for each material type (Plastic, Paper, Glass). For example, the PlasticAnimator.controller was designed with two key states: PlasticSlideIn and PlasticSlideOut. I used Unity’s Animator window to define the flow of these states: the animation begins with PlasticSlideIn when a plastic object is detected, and transitions to PlasticSlideOut when the object is no longer recognized. I organized the GameObject hierarchy so that each material's info card has its own Animator with the correct controller attached. I linked these animations to the ModelTargetEvent.cs script, which listens for the object's tracking status and plays the appropriate animation using SetTrigger() methods. This setup allowed the information card UI to appear smoothly with a sliding motion when an object is scanned, and hide again when the object is removed from view. Each material has its own animator and animation clips to ensure smooth and responsive interaction.
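The controller script itself isn't reproduced above, so here is a minimal sketch of the SetTrigger()-based card control; the trigger parameter names ("SlideIn"/"SlideOut") are assumptions standing in for the per-material states described above.

```csharp
using UnityEngine;

// Sketch of a per-material card controller. The Animator is expected
// to have trigger parameters wired to the slide states
// (e.g. PlasticSlideIn and PlasticSlideOut).
public class CardAnimationController : MonoBehaviour
{
    [SerializeField] private Animator animator;

    public void ShowCard()
    {
        gameObject.SetActive(true);     // make sure the card is visible
        animator.SetTrigger("SlideIn"); // assumed trigger name
    }

    public void HideCard()
    {
        // The object could be deactivated once the slide-out clip
        // finishes, e.g. via an animation event.
        animator.SetTrigger("SlideOut"); // assumed trigger name
    }
}
```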


Fig. 3.21 CardAnimationController.cs and ModelTargetEvent.cs.

For the animations, I also built a dedicated script called CardAnimationController.cs and later refactored it to include SetActive(true/false) logic, ensuring the cards appeared and disappeared smoothly based on tracking status.

To connect Vuforia’s tracking feedback with UI card activation, I wrote a new script, ModelTargetEvent.cs, using an enum to distinguish the different material types. I used ObserverBehaviour.OnTargetStatusChanged to trigger the animation methods from CardAnimationController.cs accordingly, so each material card (plastic, paper, or glass) animates only when the correct model is detected. Following Mr. Razif’s advice, I tested various 3D models exported from Freepik, CGTrader, and Polycam to ensure compatibility and visibility of object features for better tracking. He also encouraged using Model Targets instead of Image Targets for stability and real object detection. Finally, I built the animation system into Unity by connecting the scripts, animators, and model target events. I tested the interactions using Unity’s Play Mode and refined the scripts to make sure each card worked independently based on the recognized object.
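Since the full ModelTargetEvent.cs isn't reproduced here, below is a simplified sketch of that wiring, assuming Vuforia 10's observer API and reusing the CardAnimationController sketched earlier; the MaterialType enum and cardController field are illustrative names.

```csharp
using UnityEngine;
using Vuforia;

// Simplified sketch of ModelTargetEvent.cs: subscribes to the model
// target's tracking status and drives the matching info card.
public class ModelTargetEvent : MonoBehaviour
{
    public enum MaterialType { Plastic, Paper, Glass } // illustrative enum

    [SerializeField] private MaterialType materialType;
    [SerializeField] private CardAnimationController cardController;

    private ObserverBehaviour observer;

    private void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += HandleStatusChanged;
    }

    private void OnDestroy()
    {
        observer.OnTargetStatusChanged -= HandleStatusChanged;
    }

    private void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Treat TRACKED / EXTENDED_TRACKED as "object in view".
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        if (tracked) cardController.ShowCard();
        else cardController.HideCard();
    }
}
```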

Last Stage of Task 3

Asset Building / Collection:

Fig. 3.22 Font.

Fig. 3.23 Scenes.

Fig. 3.24 Scripts.

Fig. 3.25 UI Elements.

Fig. 3.26 3D Assets.

Fig. 3.27 Animations.

We maintained a clean folder structure in Unity for better asset management. All 3D models were stored in the 3DModel folder, animations in Animation, UI graphics in UIelements, and logic in Scripts. This organization helped during debugging and testing, especially when building the app for iOS using Unity 6.0.

At this stage, we have not fully completed the coding for Task 3, but we’ve made foundational progress by experimenting with 3D model compatibility and Vuforia integration. We've established key systems for animation and scene transitions, and confirmed that our structure is scalable for future tasks such as scoring, quizzes, or data logging. Overall, this phase helped us better understand how AR object tracking works, how Unity’s Animator integrates with real-time events, and how essential clean asset management is in a multi-scene project.

Building and Running the App on the iPhone:

Fig. 3.28 "Render Over Native UI"

While preparing our Unity project for Build and Run, we encountered an issue that prevented the app from launching properly. With the help of Mr. Razif, we discovered that the problem was related to missing configurations in the Player settings. Specifically, we needed to enable "Render Over Native UI" under the iOS Resolution and Presentation settings. Additionally, we learned that an Event System was required in each scene for UI elements such as buttons to register input. These were small but crucial steps we had overlooked, and solving them gave us a better understanding of Unity’s build requirements and scene setup. It was a valuable debugging moment that strengthened our confidence in deploying the app correctly.

What's In Progress / Coming Soon:

Fig. 3.29 What has been done and what still needs work.

Fig. 3.30 Work Distribution.

Final Submission:
1. Prototype Walkthrough Video: Google Drive / YouTube
2. Presentation Slide: Canva Slide
3. Presentation Video: Google Drive / YouTube
4. Google Drive folder: Link

GreenLens Task 3 Prototype Walkthrough Video:

GreenLens Task 3 Presentation Recording with the Prototype Walkthrough Video:

GreenLens Task 3 Presentation Slide:


FEEDBACK

Figure 4.1 Online Consultation with Mr. Razif

Week 5:
Mr. Razif mentioned that the first version of the logo needs revision — some text should be enlarged, and additional spacing is needed to avoid crowding, making the layout more spacious and user-friendly.

Week 6:
Mr. Razif has approved the new brand logo. As for the visual elements on the game page, he said there's no need to overly decorate them because the design is meant to interact with real-world spaces. He also suggested adding a "Pause Game" icon and a gameplay demonstration.

Week 7:
Mr. Razif said the colours we used are fine. He also told us not to stress too much about whether the code works at this stage — just focus on researching and organising what we currently have. Even if the outcome isn’t successful, it’s okay. There will be more lessons later on where we can learn and apply the knowledge to future projects.

Week 8 & Week 9:
Mr. Razif advised us to focus on developing the MVP (Minimum Viable Product) flow for our AR project and bring it to a functioning prototype stage. He also suggested that we refer to our seniors' blog documentation for guidance and inspiration on how to structure our process, showcase key features, and present our development clearly.

Week 10:
No feedback given.


REFLECTION

