Developing these three artefacts, I worked efficiently under strict time constraints, discovering innovative methods or enlisting help to rectify any issues that came my way. Each project presented numerous complications, but my capacity to devise solutions, or to learn from others in resolving these challenges, has given me confidence in my capabilities and newfound skills for future projects. I will now explain how I came to this realisation and describe the development process of each project.
The first artefact was the 360 film project. Conceptually, I knew of its existence, but I had never attempted to create a 360 film of my own before. Thankfully, this project was done in pairs, and I partnered with a classmate who had significantly more experience with computer software and editing than I did. Under our time constraints, we agreed it was more appropriate to work mostly within our existing individual strengths. Thus, we decided that she would handle most of the technical editing work in Adobe After Effects, and I would manage the narrative and creative direction, including editing the interactive element of the film in Eevo.
When gathering ideas for our narrative, we heeded the assignment brief and decided to base our plot around a futuristic, post-apocalyptic human civilization. I drew inspiration from multiple video games to aid in devising lore for our 360 film world. We designed a foundational narrative before beginning filming, allowing ourselves to rule out areas that would be unsuitable for our world. We were more efficient in filming this way and proceeded onto editing the videos quickly.
In charge of creative and narrative direction, I advised my partner on what to edit for our futuristic, spaceship-like world. We were, however, overly ambitious with our goals, especially considering both of us were unfamiliar with Adobe After Effects. We consequently simplified our vision to ensure it was achievable with our skillset, while still establishing our intended immersion in the world we created. As my partner edited, I sketched out a preliminary storyboard to plan exactly how each scene would flow into the next for the interactive aspect. I then worked from this storyboard to connect the various scenes in Eevo. However, another issue arose here – we’d both assumed that text could be written directly onto scenes in Eevo, but after emailing our tutor to check, I discovered this wasn’t the case. Running out of time, and without After Effects on my laptop, I created a slideshow of each scene with the added text, then screen-recorded this to use in Eevo. I understood the brief was to edit in After Effects, but this was the simplest and most efficient fix for our untimely complication. I feel this demonstrates my ability to overcome unplanned obstacles effectively, using the resources at my disposal.
Overall, I am content with the final 360 film, but given more time and experience with After Effects and Eevo, the quality of the editing and the immersion of the film could have been much greater. Though I enjoy narrative direction, for similar future projects I’d like to tackle the more technical aspects that I am less experienced with, to further my knowledge and skills.
Our second artefact was the virtual world-building project. I was both excited and nervous about this: excited to create a virtual Singapore and experience my hometown in a new way, and nervous about using unfamiliar software, Unity. Despite my reservations, I found this project the most enjoyable for its blend of creativity and technicality; Unity was generally easy to use and understand.
Once again, given our time constraints and my lack of skill in polygon modelling, it didn’t seem feasible to model my Singapore landmarks in Maya. Instead, I opted to download my landmark assets online and edit them in Unity to make them my own.
I faced a few obstacles with this software, despite enjoying it, but each issue was easily remedied with help from classmates with prior Unity experience, or from our tutor. One error, now burned into my memory never to be repeated, was failing to save my Unity work as a “project” rather than a “scene”. I mistakenly saved my work as a scene file, and when I subsequently loaded it, I found that game mode no longer functioned. After much research, fiddling, and assistance from my classmates and tutor, we discovered that the issue was due to the main camera and the third-person walker camera having different displays after loading it as a scene. From now on, I will always make sure to save my Unity work as a project file.
I completed this project quickly and efficiently despite the setbacks and enjoyed doing so. Experiencing childhood landmarks I hold dear in a completely different manner, and being able to share these intimate memories with others, made this project incredibly special. However, if I were to redo it, I would dedicate more time to finer details and the realism of the world, as I feel it would be more immersive that way.
Our final artefact was creating a virtual world avatar in Spark AR. Starting this project behind everyone else due to illness increased pressure, but after browsing through Lenslist for filter inspiration, all those concerns seemed to dissipate. They were instead replaced with an excitement to create something unique and beautiful.
A holographic, iridescent face was the inception of my filter-to-be; it portrayed ethereality and otherworldly beauty. I wanted to build upon the holographic face with a mermaid-esque crown to further convey this impression.
I encountered slight hiccups while constructing my filter, the most significant of which was not knowing how to create the holographic face appearance. I sought my tutor’s support on this, and he suggested I use a variety of different-coloured lights to create the effect. Thankfully, he was entirely correct; I used this method to create a multicoloured, iridescent facial effect that followed the user’s head movements. He also advised me on how to fit my crown asset perfectly onto the user’s head, as I had struggled with this: he suggested I cut the back half off the crown model in Blender, giving the illusion of the crown fitting around one’s head.
Because of his assistance, I was able to create my filter exactly as intended, exuding a surrealistic, mystical beauty. Additionally, I now have a wealth of knowledge on producing certain effects for a filter in Spark AR. I plan to practise these newfound skills and create more filters in Spark AR, as I discovered such a sense of accomplishment when my filter was functional and published.