Experiential Design / Task 3
26.05.2025 - 30.07.2025 (Week 6 - Week 10)
Tan Sin Mi (0368821)
Experiential Design
Task 3
INSTRUCTIONS
EXERCISES
TASK 3 - PROJECT MVP PROTOTYPE
Loading Page:
We began our app development journey by designing a simple yet effective loading scene. It served as the first touchpoint between the user and the app, displaying the GreenLens logo and setting the tone for the experience ahead. We wanted it to feel quick and purposeful, not just a delay screen. Technically, this part was straightforward, but we encountered a small font rendering issue with Unicode characters in the Poppins-Medium SDF, which we resolved using fallback characters. Overall, it gave us a solid start and helped establish a clean design standard from the beginning.
Figure 1.1 Loading Page
Figure 1.2 AutoSceneLoader.cs
- Logo: A large, centered GreenLens logo with magnifier icon representing recycling assistance.
- Tagline: “Your Recycling Tutor Assistant” reinforces the app’s function and value.
- Background: Clean green tone, symbolizing sustainability and eco-friendliness.
- Canvas Setup: Uses Unity’s UI Canvas system and is layered above the camera and directional light.
The loading scene uses the SceneLoader.cs script to transition to the “Launching” scene after a short delay or animation sequence.
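Below is a minimal sketch of what a timed loader like this could look like. It is only illustrative; the field names and delay value are assumptions rather than the exact code used in our AutoSceneLoader.cs.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch of a timed scene loader (illustrative; field names are assumptions).
public class AutoSceneLoader : MonoBehaviour
{
    [SerializeField] private string nextSceneName = "Launching"; // scene to load next
    [SerializeField] private float delaySeconds = 2f;            // how long the logo stays on screen

    private void Start()
    {
        StartCoroutine(LoadAfterDelay());
    }

    private IEnumerator LoadAfterDelay()
    {
        yield return new WaitForSeconds(delaySeconds);
        SceneManager.LoadScene(nextSceneName);
    }
}
```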
Launching Page:
- Purpose: Briefly guides users to use their camera to scan items and receive AR-based sorting instructions.
- Message: “Use your camera to scan any item and get instant AR guidance on the correct recycling bin to use.”
- Interaction: Includes a “Next” button that transitions users to the next tutorial or directly into the Home Page/Scan mode.
- Panel > Background: Holds the main layout and colour background.
- Icon A–C: Animated or static illustrations representing the scanning process (recyclable bin with a magnifying glass).
- NextBtn: Interactive button component styled with TextMesh Pro, connected to navigation logic.
- Title & Body Text: Centre-aligned onboarding message styled with SDF fonts for crisp display.
Home Page:
The Home Page serves as the main dashboard of the GreenLens AR app. It welcomes users and provides quick insights into their recycling impact, as well as easy access to other core features.
Header Section:
- User Greeting: Personalized with the user’s name (e.g., “Hi, Yan”).
- Avatar/Character Icon: Adds a fun, gamified feel to the page.
Score Stats Panel:
- Coins Earned: Tracks gamified rewards earned through recycling.
- CO₂ Saved: Displays how much carbon dioxide (CO₂) has been reduced by recycling.
- Scan Activity Icon: A visual indicator tied to scan usage or progress.
Info Section:
- Text reads: “How much carbon dioxide (CO₂) emission you can avoid from recycling different materials?”
- Tapping the info button opens the Study Scene (as seen in the OnClick() event in the Button component).
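For reference, the OnClick() event could simply point at a small helper method like the sketch below; the SceneNavigator name and the "Study" scene string are assumptions based on how we named our scenes, not the exact project code.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical helper: OpenStudyScene() would be assigned to the info button's
// OnClick() event in the Inspector. The "Study" scene name is an assumption.
public class SceneNavigator : MonoBehaviour
{
    public void OpenStudyScene()
    {
        SceneManager.LoadScene("Study");
    }
}
```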
Scan Button:
Direct call-to-action: “Scan Item” button for initiating the item recognition feature.
Bottom Navigation Bar: Home | Game | Scan | Bin Finder | Settings
Study Page:
The Study Scene is an educational page designed to help users understand the science behind recycling and how it translates to carbon savings.
Features:
- Material Breakdown Table: Shows average CO₂ savings per kg of recycled material:
- Aluminum: ~10.9 kg CO₂ saved
- Plastic: ~3.5 kg CO₂ saved
- Paper: ~4.3 kg CO₂ saved
- Glass: ~0.3 kg CO₂ saved
- Steel: ~1.2 kg CO₂ saved
- Formula Section: CO₂ Saved = Weight of Material × CO₂ Factor (includes a user-friendly example for better comprehension; see the sketch after this list).
- Navigation Bar: Includes access to other parts of the app (Home, Scan, etc.) for seamless movement between learning and action.
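To make the formula above concrete, here is a small sketch of how the CO₂ savings could be computed from the table values; the class and method names are illustrative and not part of our actual project code.

```csharp
using System.Collections.Generic;

// Illustrative sketch of the Study page formula: CO2 Saved = Weight of Material × CO2 Factor.
// Factor values come from the table above; class and method names are assumptions.
public static class Co2Calculator
{
    // Average kg of CO2 saved per kg of recycled material.
    private static readonly Dictionary<string, float> Co2Factors = new Dictionary<string, float>
    {
        { "Aluminum", 10.9f },
        { "Plastic",   3.5f },
        { "Paper",     4.3f },
        { "Glass",     0.3f },
        { "Steel",     1.2f },
    };

    public static float GetCo2Saved(string material, float weightKg)
    {
        return Co2Factors.TryGetValue(material, out float factor) ? weightKg * factor : 0f;
    }
}

// Example: recycling 2 kg of plastic saves 2 × 3.5 = 7 kg of CO2.
// float saved = Co2Calculator.GetCo2Saved("Plastic", 2f);
```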
Summary Page:
The summary scene gave us the opportunity to reinforce positive feedback and celebrate user achievements. After scanning or recycling an item, users see their updated stats and a motivational message—like saving enough energy to light an LED bulb. We focused on making the layout feel rewarding without overwhelming the user. The buttons to scan another item or return home were designed for seamless flow. This scene tied the experience together and made users feel that their actions had value. It was fulfilling to build something that could potentially boost eco-friendly habits.
The Summary Scene provides immediate feedback after a user scans or sorts an item, offering a sense of achievement and reinforcing positive behavior.
Features:
- Summary Stats: Displays key recycling metrics (Coins Earned, Items Recycled, CO₂ Saved)
- Achievement Message: Example: "You’ve saved enough energy to light an LED bulb for 6 hours!" This adds a tangible, real-world context to the environmental impact.
- Navigation Buttons:
- Scan Another Item – loops user back to the scanning scene.
- Back to Home – returns user to the dashboard/home page.
Game Scene:
For Task 3, we haven’t started coding yet, but we focused on exploring 3D model compatibility and customization in Unity. Our main goal was to test different file formats (such as .fbx) and successfully import them into the scene. We experimented with changing the color of the bins and applying textures or icons like the recycling symbol. This hands-on process helped us understand how materials, shaders, and mesh renderers work in Unity. Although there's no gameplay logic yet, this stage was important for setting up the visual foundation of the game and preparing us for the next phase, which will involve scripting interactions like drag-and-drop and scoring.
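As a rough illustration of the material work described above, the snippet below shows one common way to tint a bin's mesh at runtime. The BinColorizer class and its fields are hypothetical; in Task 3 we only changed colors through material assets in the Editor.

```csharp
using UnityEngine;

// Hypothetical example of tinting a bin model at runtime through its MeshRenderer.
// Accessing renderer.material creates a per-object material instance, so each bin
// can get its own colour without changing the shared material asset.
public class BinColorizer : MonoBehaviour
{
    [SerializeField] private MeshRenderer binRenderer;                    // renderer on the bin mesh
    [SerializeField] private Color binColor = new Color(0.2f, 0.5f, 1f);  // e.g. blue for the paper bin

    private void Start()
    {
        binRenderer.material.color = binColor;
    }
}
```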
UI Components Built:
Instruction Panel: Provides sorting guidance:
- 🔵 Blue – Paper
- 🟠 Orange – Plastic & Metal
- 🟤 Brown – Glass
- ⚫ Black – General Waste
- Score System: Placeholder ScoreNum and Score counters are included for future logic (see the sketch after this list).
- Navigation Bar: Includes buttons for Home, Game, Bin Finder, and Settings.
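Since the score objects are only placeholders for now, the snippet below is a rough sketch of how the counter might eventually be driven; the ScoreManager name and the TextMeshPro field are assumptions about the future logic.

```csharp
using TMPro;
using UnityEngine;

// Hypothetical sketch of future scoring logic for the game scene.
// Assumes ScoreNum is a TextMeshPro label showing the current score.
public class ScoreManager : MonoBehaviour
{
    [SerializeField] private TMP_Text scoreNumText; // the ScoreNum UI text
    private int score;

    // Intended to be called when an item lands in the correct bin.
    public void AddPoints(int points)
    {
        score += points;
        scoreNumText.text = score.ToString();
    }
}
```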
3D Models Imported:
- Several trash bin models have been added in .fbx format.
- Materials and textures (like RecycleBinTexture) were applied successfully.
- Bin colors were changed using materials (e.g., ColourBin4), with different bins assigned their designated hues.
Although the game scene is still in progress, we’ve learned a lot from preparing its 3D elements. We imported bin models in .fbx format and experimented with applying materials, changing colors, and placing textures such as the recycling icon. Getting the material layering right was initially confusing, but we managed to assign separate materials to different mesh parts. While no interactivity has been added yet, this stage helped us understand Unity’s 3D environment better and laid the foundation for drag-and-drop functionality. It’s exciting to see the gameplay space take shape visually.
Fourth Stage of Functioning Prototype Development in Unity:
3D Scanner - Polycam:
I set up the AR Scan scene’s user interface inside a Canvas. I created key elements such as the Scan UI, Info Cards (Plastic, Paper, Glass), and integrated buttons like “Next” and “Back” to support scene navigation. The scene was structured inside a parent GameObject named ARScan, organizing all essential GameObjects for camera, UI, and AR targets.

I installed the Vuforia Engine through Unity’s Package Manager to enable AR functionality. After integration, I used the Vuforia Model Target Generator (MTG) to upload and train 3D models such as a plastic bottle, dropper bottle, and glass serum bottle. These models were imported in both .obj and .fbx formats to test their tracking compatibility.
Within MTG, I created Model Targets for each material category — Plastic, Paper, and Glass — and configured a Guide View for each object to help the ARCamera recognize the model from a specific angle. Once training was complete, I imported the generated datasets into Unity to activate real-world tracking.
In Unity, I configured the ARCamera and placed the Model Targets into the scene. I also positioned the corresponding 3D models in the scene to test detection accuracy and alignment during runtime.
For the animation system in the AR Scan scene, I created separate Animator Controllers for each material type (Plastic, Paper, Glass). For example, the PlasticAnimator.controller was designed with two key states: PlasticSlideIn and PlasticSlideOut. I used Unity’s Animator window to define the flow of these states, where the animation begins with PlasticSlideIn when a plastic object is detected and transitions to PlasticSlideOut when the object is no longer recognized.

I organized the GameObject hierarchy so each material’s info card has its own Animator and attached the correct controller. I linked these animations to the ModelTargetEvent.cs script, which listens for the object’s tracking status and plays the appropriate animation using SetTrigger() methods. This setup allowed the information card UI to appear smoothly with a sliding motion when an object is scanned and hide again when the object is removed from view. Each material has its own unique animator and animation clips to ensure smooth and responsive interaction.
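The sketch below shows the general shape of a handler like this; it is not the exact ModelTargetEvent.cs code, and it assumes the Animator Controller defines triggers with the same names as the slide states. The two public methods would be wired to the model target’s tracking-found and tracking-lost events (for example, through Vuforia’s DefaultObserverEventHandler in the Inspector).

```csharp
using UnityEngine;

// Illustrative tracking-event handler (the real ModelTargetEvent.cs may differ).
// OnTargetFound/OnTargetLost are meant to be hooked to the model target's
// tracking events and simply fire the matching Animator triggers.
public class InfoCardAnimationHandler : MonoBehaviour
{
    [SerializeField] private Animator infoCardAnimator;                  // Animator on the material's info card
    [SerializeField] private string slideInTrigger = "PlasticSlideIn";   // assumed trigger names
    [SerializeField] private string slideOutTrigger = "PlasticSlideOut";

    // Called when the model target is detected.
    public void OnTargetFound()
    {
        infoCardAnimator.SetTrigger(slideInTrigger);
    }

    // Called when tracking of the model target is lost.
    public void OnTargetLost()
    {
        infoCardAnimator.SetTrigger(slideOutTrigger);
    }
}
```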
Asset Building / Collection:
While preparing our Unity project for build and run, we encountered an issue that prevented the app from launching properly. With the help of Mr. Razif, we discovered that the problem was related to missing configurations in the Player settings. Specifically, we needed to enable "Render Over Native UI" under the iOS resolution and presentation settings. Additionally, we learned that an Event System was required in each scene to ensure that UI elements such as buttons could register input. These were small but crucial steps we had overlooked, and solving them gave us a better understanding of Unity’s build requirements and scene setup. It was a valuable debugging moment that strengthened our confidence in deploying the app correctly.
What's In Progress / Coming Soon:
FEEDBACK
Figure 4.1 Online Consultation with Mr. Razif
REFLECTION