Immersive Visual Effects Module

Table of Contents
Module Information
Lecture
Week 1 Introduction to Visual Effects

On our first day, Mr Fauzi gave us a brief introduction to the module and told us what to expect for the semester. He walked us through the module information booklet and the weekly lesson plan, and also discussed the assignments, assessments, and their weightage.
Week 2 Virtual Dolly and Camera Tracking

This week’s lecture taught us how to use the camera tool in After Effects to create cool animations. The tutorial video showed us how to move the camera around to make the animation look more dynamic and fun. I tried using this technique in my exercise to see how it works.
Week 3 Zenith & Nadir

The lecture video clearly explained how the observer’s horizon relates to the celestial sphere. The most important part is understanding the zenith and the nadir.
The zenith is the point in the sky directly above your head, the highest point you can see. The nadir is the point directly beneath your feet, on the opposite side of the celestial sphere from the zenith, the lowest point. In short, the zenith is the top and the nadir is the bottom.
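The relationship can also be put in numbers: an object's angular distance from the zenith is just 90° minus its altitude above the horizon. A tiny sketch of my own (not from the lecture) to illustrate:

```python
def zenith_angle(altitude_deg: float) -> float:
    # The zenith angle is the complement of an object's altitude:
    # 0 deg means directly overhead (the zenith),
    # 180 deg means directly underfoot (the nadir).
    return 90.0 - altitude_deg

print(zenith_angle(90))   # at the zenith -> 0.0
print(zenith_angle(-90))  # at the nadir  -> 180.0
```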
Week 4 Compositing HDRI Image

HDRI (High Dynamic Range Image) is a digital image format that stores a much wider range of brightness than a standard photo, letting you balance highlights and shadows so the result looks more natural. It is commonly created with photo editing software such as Photoshop.
Advantages:
- Enhances and customises images for better results.
- Compatible with many photo editing programmes.
- Easy to create using Photoshop.

Disadvantages:
- 32-bit HDRI files are large and limit some editing and sharing features in Photoshop.
- They must be converted to 16-bit or 8-bit to fully access Photoshop’s tools.
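That 32-bit to 8-bit conversion is where tone mapping happens: the extra dynamic range has to be compressed before a normal screen can display it. As a rough illustration of the idea (this is the generic Reinhard operator, not necessarily what Photoshop or HDReye actually use):

```python
import numpy as np

def reinhard_tonemap(hdr: np.ndarray) -> np.ndarray:
    """Map linear HDR radiance values (float, unbounded) to 8-bit.

    Simple global Reinhard operator: L / (1 + L) compresses highlights,
    then a display gamma is applied before quantising to uint8.
    """
    ldr = hdr / (1.0 + hdr)                       # compress [0, inf) into [0, 1)
    ldr = np.clip(ldr, 0.0, 1.0) ** (1.0 / 2.2)   # gamma for display
    return (ldr * 255).astype(np.uint8)

# A toy 1x3 "image" with a shadow, a midtone, and a very bright highlight:
hdr = np.array([[0.05, 1.0, 50.0]])
print(reinhard_tonemap(hdr))
```

Notice that the 50.0 highlight is not simply clipped to white; it is squeezed into the displayable range, which is why HDR conversions keep detail in bright areas.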
Figure 2.2 Steps to Create HDRI Files
This video showed us how to use an app called HDReye to create HDRI images. We had the chance to try it out during class. However, exporting the images requires payment, since that feature is not included in the free version.
Week 5 Animation Techniques

In this week's lecture, we were introduced to the Roto Brush tool in After Effects. This tool allows users to separate a subject from its background in a video, similar to how a green screen works. Instead of manually masking frame by frame, the Roto Brush lets us paint over the subject, and it automatically tracks the edges over time. This makes it much easier and faster to isolate people or objects for effects, background changes, or compositing.
Besides that, we also had a tutorial on how to use Track Matte in After Effects, which is a technique used to control the visibility of one layer based on the transparency or shape of another layer. For example, it allows users to create effects where text or images only appear within certain shapes or moving objects, giving a more dynamic and professional look to animations.
Generating masks and alpha mattes, rotoscoping and 2D tracking:
1. Rotoscoping: https://youtu.be/U0bZEQMdCsg
2. Rotoscoping: https://www.youtube.com/shorts/rIbTVuzfnhE
3. 2D Tracking: https://youtu.be/tqWWhChc8RA
4. Alpha Matte: https://youtu.be/zgbXGqYMxq4
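Underneath, a track matte is just standard alpha compositing: each pixel of the fill layer is blended with the background according to the matte's transparency. A minimal NumPy sketch of the maths (my own toy example, not After Effects code):

```python
import numpy as np

def composite(fg, bg, matte):
    """Alpha-matte composite: show fg only where the matte is opaque.

    fg, bg: float RGB images in [0, 1]; matte: single-channel alpha in [0, 1].
    This mirrors an Alpha Matte in After Effects: the matte layer's
    transparency controls where the fill layer is visible.
    """
    alpha = matte[..., np.newaxis]            # broadcast alpha over RGB channels
    return fg * alpha + bg * (1.0 - alpha)

# Toy 1x2-pixel example: the matte reveals the fill on the left pixel only.
fg = np.ones((1, 2, 3))                       # white fill layer
bg = np.zeros((1, 2, 3))                      # black background
matte = np.array([[1.0, 0.0]])
print(composite(fg, bg, matte))
```

Animating the matte layer's shape is what makes text appear "inside" a moving object: the blend weights change per frame while the fill layer stays put.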
Week 6 Keying Overview

We were taught how to remove a green screen in After Effects using the Keylight effect, which cleanly removes the green background and lets the subject blend smoothly into a new one. This technique is useful for creating more professional and creative video compositions.

Steps:
1. Import the video or image with a green screen into After Effects and turn it into a composition.
2. Go to Effects & Presets and search for Keylight.
3. Drag the Keylight effect onto the video or image layer.
4. Use the eyedropper tool to select the green colour in the background; this will automatically remove the green screen.
5. Turn on the Transparency Grid to check that the background is properly removed.
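At its core, keying turns colour information into transparency: pixels that look like the key colour become see-through. A very rough sketch of the idea (real keyers like Keylight do far more, such as spill suppression and soft edges):

```python
import numpy as np

def green_key(rgb: np.ndarray, threshold: float = 0.15) -> np.ndarray:
    """Very rough chroma key: alpha = 0 where green clearly dominates.

    rgb: float image in [0, 1]. Returns a single-channel alpha matte.
    The threshold value is an illustrative choice, not a standard setting.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    green_dominance = g - np.maximum(r, b)    # how much greener than red/blue
    return (green_dominance < threshold).astype(np.float32)

pixels = np.array([[[0.1, 0.9, 0.1],     # green-screen pixel -> transparent
                    [0.8, 0.5, 0.3]]])   # skin-tone pixel    -> opaque
print(green_key(pixels))
```

The hard cutoff here is exactly why cheap keys look crunchy at the edges; production keyers ramp the alpha smoothly instead of thresholding it.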
Week 7 Compositing 3D Element Design
Steps:
1. Import the Footage
   - Go to File > Import to bring your video clip into After Effects.
   - Drag the footage onto the timeline.
2. Open the Tracker Panel
   - Go to Window > Tracker to open the Tracker panel.
3. Select the Layer to Track
   - Click on your footage layer in the timeline.
   - In the Tracker panel, click Track Motion.
4. Choose the Tracking Type
   - Position tracking is selected by default.
   - You can also enable Rotation or Scale if needed.
5. Set the Tracking Point
   - A tracking box will appear.
   - Move it to a spot in the video with high contrast (e.g., a corner or edge).
   - The inner box is the area to track; the outer box is the search area.
6. Start Tracking
   - Click the Play button (Analyse Forward) in the Tracker panel.
   - After Effects will follow the movement of that point frame by frame.
7. Apply the Tracking Data
   - Create a new Null Object (Layer > New > Null Object).
   - In the Tracker panel, click Edit Target and choose the Null Object.
   - Click Apply to apply the tracking data.
8. Attach Elements to the Tracking Data
   - Parent your layer (e.g., text or a graphic) to the Null Object using the pick whip.
   - The element now follows the tracked movement.
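The inner-box/outer-box idea in step 5 can be sketched as template matching: compare the feature patch against every position inside the search area and keep the best match. A toy version of my own (After Effects' tracker is far more sophisticated):

```python
import numpy as np

def track_point(prev_frame, next_frame, center, feature=4, search=10):
    """Find where a small feature region moved between two frames.

    Mirrors the Tracker's two boxes: the inner `feature` region is the
    patch to match, the outer `search` region is where we look for it.
    Uses sum-of-squared-differences; all names here are illustrative.
    Assumes the search window stays inside the frame.
    """
    y, x = center
    patch = prev_frame[y - feature:y + feature, x - feature:x + feature]
    best, best_pos = np.inf, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            cand = next_frame[yy - feature:yy + feature, xx - feature:xx + feature]
            if cand.shape != patch.shape:
                continue                       # window fell off the frame edge
            ssd = np.sum((cand - patch) ** 2)  # lower = better match
            if ssd < best:
                best, best_pos = ssd, (yy, xx)
    return best_pos

# A bright dot at (20, 20) moves to (23, 18) in the next frame.
prev = np.zeros((40, 40)); prev[20, 20] = 1.0
nxt = np.zeros((40, 40)); nxt[23, 18] = 1.0
print(track_point(prev, nxt, (20, 20)))  # -> (23, 18)
```

This is also why step 5 says to pick a high-contrast spot: a flat, featureless patch matches equally well everywhere, and the tracker drifts.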
Week 9 Post Production Design Principles

Go through the link below:
Project
Week 1 - Week 2

Before starting Project 1, we were given a small exercise to create a basic animation using After Effects. Mr. Fauzi provided us with a template so we could understand how the animation was built. Our task was to explore the template and then create our own simple animation based on it. The animation was required to be around 10 seconds long, and we were encouraged to add background music to make it more engaging.
To begin with, I recreated the animation that Mr. Fauzi demonstrated. This helped me better understand how everything worked before trying to create my own version. At first, I was confused by the Controller and didn’t really understand its purpose. After experimenting with it, I realised that the Controller is actually a Null Object.
A null object is very useful because it allows multiple assets to be grouped together and controlled at the same time. Instead of adjusting each element individually, I could simply animate the null object and apply the same movement or transition to all the linked assets. Once I understood this, the animation process became much smoother and more efficient.
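The parenting idea can be sketched in a few lines: each child stores an offset from its parent, so moving the parent (the null) moves everything attached to it. This is my own illustration of the concept, not how After Effects is implemented internally:

```python
class Node:
    """Minimal parenting sketch: moving a parent moves all its children.

    This mirrors a Null Object controller in After Effects: animate the
    null, and every layer parented to it follows along automatically.
    """
    def __init__(self, position=(0.0, 0.0), parent=None):
        self.position = position          # local offset from the parent
        self.parent = parent

    def world_position(self):
        if self.parent is None:
            return self.position
        px, py = self.parent.world_position()
        x, y = self.position
        return (px + x, py + y)

controller = Node()                              # the "null object"
title = Node(position=(10.0, 5.0), parent=controller)
logo = Node(position=(-3.0, 2.0), parent=controller)

controller.position = (100.0, 50.0)              # animate only the null...
print(title.world_position())                    # ...and both layers move: (110.0, 55.0)
print(logo.world_position())                     # (97.0, 52.0)
```

Because each child keeps only its local offset, the null can be keyframed once and every linked layer inherits the motion, which is exactly why the animation became much smoother once I understood this.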
Using visual effects, we developed elements such as flowing light, butterflies, and heart-shaped particles to represent invisible emotions like connection, warmth, and empathy. Special attention was given to the movement, timing, and transparency of these elements so that they would appear soft and organic when projected onto the wooden hands.
YouTube link: https://youtu.be/46pX8AOWo04
Google Drive link: https://drive.google.com/drive/folders/1I4x0nnM2TrxBHaVX1qC5-xj-Jx5TIxY9
Figure 7.3 Resolume #3
Figure 7.4 Resolume #4
Figure 7.5 Resolume #5

This process required a high level of precision, as even small misalignments could break the sense of immersion. Through repeated testing and fine-tuning, we adjusted the position, scale, and masking of the visuals to better match the physical form. This experience helped me understand the importance of accuracy in projection mapping and how technical details play a crucial role in maintaining the emotional impact of an immersive installation.

Week 10–11 – Interaction Development with TouchDesigner
During this stage, we began developing interactive elements for the installation. After consulting with the lecturer, we decided to experiment with TouchDesigner to create interactive responses between the audience and the projected visuals.
As this was our first time using TouchDesigner, the learning curve was quite steep. We started by focusing on basic interactions, exploring how user input could trigger visual changes such as the appearance and movement of light particles and butterflies. Most of this phase involved testing, troubleshooting, and understanding the logic-based workflow of the software.
Figure 8.1 TouchDesigner Progress #1
Figure 8.2 TouchDesigner Progress #2
Figure 8.3 TouchDesigner Progress #3
Figure 8.4 TouchDesigner Progress #4
The final video
Although challenging, this process helped us understand the potential of interactive media in immersive installations. It also highlighted how audience participation can transform a static visual experience into a shared and emotionally engaging one.
YouTube link: https://youtube.com/shorts/BpDXJ39CyqU
Google Drive link: https://drive.google.com/drive/folders/1I4x0nnM2TrxBHaVX1qC5-xj-Jx5TIxY9
Week 12 Alternative Interaction & QR Code System

In Week 12, we explored an additional interactive method by creating a web-based interaction. A QR code was prepared, allowing audiences to interact with the installation by scanning it on their mobile devices. By this stage, we planned to provide two interaction options:
- TouchDesigner-based interaction
- Web interaction via QR code

TouchDesigner-based interaction
Figure 12.1 TouchDesigner-based interaction #1
Figure 12.2 TouchDesigner-based interaction #2
The addition of gesture interaction enhanced the sense of connection between the viewer and the installation. Visual elements such as light particles and butterflies responded more dynamically to user movements, reinforcing the idea of emotional energy flowing through human actions. This stage deepened my understanding of how interactive technology can strengthen engagement and transform viewers from passive observers into active participants.

Web interaction via QR code

To further extend audience participation, we incorporated a web-based interaction through a QR code. Viewers were invited to scan the QR code using their mobile devices, which allowed them to interact with the installation in real time. This approach enabled interaction beyond the physical space, connecting personal digital devices with the projected visuals.

QR code
To host the web-based interaction linked to the QR code, we used Netlify as the deployment platform. Netlify allowed us to quickly publish and manage the interactive webpage, making it easily accessible to users through their mobile devices.
By deploying the interaction online, the QR code could direct the audience to a stable and responsive web interface, ensuring a smooth user experience during the installation. This approach also simplified updates and testing, allowing us to refine the interaction efficiently throughout the development process.
Web interaction via QR code link: https://merry-pithivier-2fb07a.netlify.app/

This approach offered flexibility and ensured that audience participation would still be possible even if one system encountered technical issues.

Week 13 Resolume Masking & Final

In Week 13, we focused on finalising the projection masking in Resolume Arena, refining alignment through the projector, and finalising the final video. We also documented the installation process by taking photos and videos for behind-the-scenes records and blog documentation.
Figure 10.1 Resolume Masking #1
Final Submission
The final submission presented a complete and cohesive outcome developed through Projects 1 to 3. It included the compiled proposal, work-in-progress documentation, behind-the-scenes materials, and the final video output. All supporting materials such as mood boards, storyboards, sketches, and technical process documentation were organised and presented in slide format.
Light of Connection Final Presentation
YouTube link: https://youtu.be/BMpce_OfwsY?si=XqF-NToXORMDFJvj
Google Drive link: https://drive.google.com/drive/folders/1I4x0nnM2TrxBHaVX1qC5-xj-Jx5TIxY9
The final video demonstrated the successful integration of physical elements, visual effects, projection mapping, and interactive design. Audience interaction through gesture-based controls, QR code web interaction, and TouchDesigner-driven visuals contributed to an immersive and emotionally engaging experience.
Feedback
The feedback received from the lecturer highlighted the strength of the project’s concept and its emotional clarity. The use of physical wooden hands combined with projected visual effects was seen as an effective way to communicate the theme of human connection. The integration of interaction, particularly through TouchDesigner and QR code-based web interaction, was also recognised as a strong element that enhanced audience engagement.
At the same time, the feedback pointed out areas for improvement, especially in refining the precision of projection mapping and improving the consistency of interactive responses. These comments were valuable, as they helped identify technical details that could further strengthen the immersive quality of the installation.
Reflection
This project was a meaningful learning experience that allowed me to explore the relationship between emotion, technology, and interaction. From concept development to final submission, I learned that strong ideas require both creative vision and technical discipline to be effectively communicated.
One of the biggest challenges was learning new tools such as TouchDesigner and integrating them with projection mapping and web-based interaction. Although the learning curve was steep, the process helped me develop problem-solving skills and a deeper understanding of interactive media design.
Overall, the project reinforced the idea that visual effects are not only about creating impressive visuals, but also about conveying emotions and creating shared experiences. This experience has strengthened my confidence in combining physical elements with digital technology, and it has influenced how I approach immersive and interactive design in future projects.
































