https://docs.google.com/document/d/1Bli95B6Bhru7O30ukRyyUzprBkMlH8l7DI2AuC9vtwY/edit
The link contains videos of each artefact in use.
Developing these three artefacts, I worked efficiently under strict time constraints, discovering innovative methods or enlisting help to rectify any issues that came my way. Each project bore numerous complications, but my capacity to devise solutions, or to learn from others to resolve these challenges, has instilled confidence in my capabilities and newfound skills for future projects. I will now describe how I came to this realisation, and the development process of each project.
The first artefact was the 360 film project. Conceptually, I knew of its existence, but I had never attempted to create a 360 film of my own before. Thankfully, this project was done in pairs, and I partnered with a classmate who had significantly more experience with computer software and editing than I did. Under our time constraints, we agreed it was more appropriate to work mostly within our existing individual strengths. Thus, we concurred that she would handle most of the technical editing work in Adobe After Effects, and I would manage the narrative and creative direction, including the interactive element of the film, editing in Eevo.
When gathering ideas for our narrative, we heeded the assignment brief and decided to base our plot around a futuristic, post-apocalyptic human civilization. I drew inspiration from multiple video games to aid in devising lore for our 360 film world. We designed a foundational narrative before beginning filming, allowing ourselves to rule out areas that would be unsuitable for our world. We were more efficient in filming this way and proceeded onto editing the videos quickly.
In charge of creative and narrative direction, I advised my partner on what to edit for our futuristic, spaceship-like world. We were, however, overly ambitious with our goals, especially considering both of us were unfamiliar with Adobe After Effects. We consequently simplified our vision to ensure it was achievable for our skillset, while still establishing our intended immersion in the world we created. As my partner edited, I sketched out a preliminary storyboard to plan exactly how each scene would flow into the next for the interactive aspect. I then worked from this storyboard to connect the various scenes in Eevo. However, another issue arose here – we'd both assumed that text could be written directly on scenes in Eevo, but after emailing our tutor to check, I discovered this wasn't the case. Running out of time, and without After Effects on my laptop, I created a slideshow of each scene with the added text, then screen-recorded this to use in Eevo. I understood the brief was to edit in After Effects, but this was the simplest and most efficient fix for our untimely complication. I feel this demonstrates my ability to overcome unplanned obstacles effectively using the resources at my disposal.
Overall, I am content with the final 360 film, but given more time and experience with After Effects and Eevo, the quality of the editing and the immersion of the film could have been much greater. Though I enjoy narrative direction, for similar future projects I'd like to tackle the more technical aspects that I am less experienced with, to further my knowledge and skills.
Our second artefact was the virtual world-building project. I was both excited and nervous about this: excited to create a virtual Singapore and experience my hometown in a new way, and nervous about using an unfamiliar software, Unity. Despite my reservations, I found this project to be the most enjoyable for creativity and technicality simultaneously; Unity was generally easy to use and understand.
Once again, under our time constraints and given my limited polygon-modelling skills, it didn't seem feasible to model my Singapore landmarks in Maya. Instead, I opted to download my landmark assets online and edit them in Unity to make them my own.
I faced a few obstacles with this software, despite enjoying it, but each issue was easily remedied with help from classmates with prior Unity experience, or from our tutor. One error, now burned into my memory and never to be repeated, was saving my Unity work as a “scene” file rather than a “project”. When I subsequently loaded it, I found that the game mode no longer functioned. After much research, fiddling, and assistance from my classmates and tutor, we discovered that the issue was the main camera and the third-person walker camera being assigned to different displays after loading the file as a scene. From then on and forevermore, I will save my Unity work as a project file.
I completed this project quickly and efficiently despite the setbacks and enjoyed doing so. Experiencing childhood landmarks I hold dear in a completely different manner, and being able to share these intimate memories with others, made this project incredibly special. However, if I were to redo it, I would dedicate more time to finer details and the realism of the world, as I feel it would be more immersive that way.
Our final artefact was creating a virtual world avatar in Spark AR. Starting this project behind everyone else due to illness increased the pressure, but after browsing through Lenslist for filter inspiration, all those concerns seemed to dissipate. They were instead replaced with an excitement to create something unique and beautiful.
A holographic/iridescent face was the inception of my filter-to-be; it portrayed ethereality and otherworldly beauty. I wanted to build upon the holographic face with a mermaid-esque crown to further emanate this impression.
I encountered slight hiccups while constructing my filter, the most prevalent of which was not knowing how to create the holographic face appearance. I sought my tutor’s support on this, and he suggested I use a variety of different-coloured lighting to create the effect. Thankfully, he was entirely correct; I used this method to create a multicoloured, iridescent facial effect that followed the user’s head movements. He also advised me on how to fit my crown asset onto the user’s head, as I struggled with this. He suggested I cut the back half off the crown model in Blender, giving the illusion of the crown fitting around one’s head.
Because of his assistance, I was able to create my filter exactly as intended, exuding a surrealistic, mystical beauty. Additionally, I now have a wealth of knowledge on producing certain effects for a filter in Spark AR. I plan to practise these newfound skills and create more filters in Spark AR, as I felt such a sense of accomplishment when my filter was functional and published.
Here is the link to the presentation of my final artefacts, and the process of creating each of them:
https://docs.google.com/presentation/d/1MJa0tsLjw1tOQHVSbUZovtfgZiRhpL2b7UMAX35iztM/edit#slide=id.p
In week 6, all three years of our course took a trip to York for the Aesthetica Film Festival. Unfortunately, I fell ill the week before (a day before my birthday – truly tragic) and so was unable to go. Being as ill as I was for those two weeks, I missed week 7 of lessons as well. However, I talked to my classmates and they told me of a new project they covered during week 7, which I intend to catch up on this week (week 8). My plan for this week is to continue working on the Unity project, the 360 film, and the new Spark AR project to ensure they are all up to standard before the submission deadline at the end of this month. With only two weeks left, I don’t think they’ll be done to the quality I would ideally like, especially as I’m a perfectionist. However, I will do my best to manoeuvre this new software and get my projects up to a good standard.
During the first week of lectures, we covered a lot of theory about the history of animation and film, as well as information on virtual reality (VR), augmented reality (AR) and mixed reality (MR). It was definitely interesting to learn how ideas from society back in the early 1900s, for example, have come to fruition now.
We discussed our upcoming assessments and what they would entail, including the software needed – such as Maya and Unity. I’m a little nervous about using this software as, from what I’ve heard, it can be quite complicated and frustrating, but I’m also excited to be able to freely create in 3D, and eventually to experience in VR a project I constructed entirely myself. I imagine it’s quite thrilling and surreal.
Our nearest upcoming assessment is to create a 360 film. I’ve used GoPros before to create my own little holiday vlogs, but I’ve never tried making a 360 video so I’m quite intrigued to see how they’re made, how to edit and whatnot.
Our second assessment is making a fantasy world in Maya to then view in VR. I’m extremely excited about the creative freedom we have for this project, and we started working on our moodboards. My initial idea is to create a post-apocalyptic dystopian fantasy world, where the viewer can ponder for themselves what happened to the world to send it into this post-apocalyptic state, and what kind of society lives there now. I detailed some basic ideas in a moodboard slideshow. The link to the slideshow is below; it contains more detailed descriptions of each piece of concept art – why I chose it, and how it inspires me.
My moodboard slideshow: https://docs.google.com/presentation/d/1nogYN4W9rP4hoJcKVxW4ZeYexwz1h-w0kb5qtL7uBhE/edit?pli=1#slide=id.g15bbe60f7b5_0_107