Categories
360 Filmmaking Augmented Reality Introduction to Virtual Reality Virtual World Building

Critical Appraisal

In developing these three artefacts, I worked efficiently under strict time constraints, discovering innovative methods or enlisting aid to rectify any issues that came my way. Each project presented numerous complications, but my capacity to devise solutions, or to learn from others in resolving these challenges, has instilled confidence in my capabilities and newfound skills for future projects. I will now describe how I came to this realization, and the development process of each project.

The first artefact was the 360 film project. Conceptually, I knew of its existence, but I had never attempted to create a 360 film of my own before. Thankfully, this project was done in pairs, and I partnered with a classmate who had significantly more experience with computer software and editing than I did. Under our time constraints, we agreed it was more appropriate to work mostly within our existing individual strengths. Thus, we concurred that she would handle most of the technical editing work in Adobe After Effects, and I would manage the narrative and creative direction, including editing the film's interactive element in Eevo.
When gathering ideas for our narrative, we heeded the assignment brief and decided to base our plot around a futuristic, post-apocalyptic human civilization. I drew inspiration from multiple video games to aid in devising lore for our 360 film world. We designed a foundational narrative before beginning filming, allowing ourselves to rule out areas that would be unsuitable for our world. We were more efficient in filming this way and proceeded onto editing the videos quickly.
In charge of creative and narrative direction, I advised my partner on what to edit for our futuristic, spaceship-like world. We were, however, overly ambitious with our goals, especially considering both of us were unfamiliar with Adobe After Effects. We consequently simplified our vision to ensure it was achievable for our skillset, while still establishing our intended immersion in the world we created. As my partner edited, I sketched out a preliminary storyboard to plan exactly how each scene would flow into the next for the interactive aspect. I then worked off this storyboard to connect the various scenes in Eevo. However, another issue arose here – we'd both assumed that text could be written directly on scenes in Eevo, but after emailing our tutor to check, I discovered this wasn't the case. Running out of time, and without After Effects on my laptop, I created a slideshow of each scene with the added text, then screen-recorded this to use in Eevo. I understood the brief was to edit in After Effects, but this was the simplest and most efficient fix for our untimely complication. I feel this demonstrates my ability to overcome unplanned obstacles effectively using the resources at my disposal.
Overall, I am content with the final 360 film, but given more time and experience with After Effects and Eevo, the quality of the editing and the immersion of the film could have been much greater. Though I enjoy narrative direction, for similar future projects I'd like to tackle the more technical aspects that I am less experienced with, to further my knowledge and skills.

Our second artefact was the virtual world-building project. I was both excited and nervous about this: excited to create a virtual Singapore and experience my hometown in a new way, and nervous about using an unfamiliar software, Unity. Despite my reservations, I found this project to be the most enjoyable for creativity and technicality simultaneously; Unity was generally easy to use and understand.
Once again, given our time constraints and my limited polygon-modelling skills, it didn't seem feasible to model my Singapore landmarks in Maya. Instead, I opted to download my landmark assets online and edit them in Unity to make them my own.
I faced a few obstacles with this software, despite enjoying it, but each issue was easily remedied with help from classmates with prior Unity experience, or from our tutor. One lesson now burned into my memory, never to be forgotten, is to save my Unity work as a “project” rather than just a “scene”. I mistakenly saved my work as a scene file, and when subsequently loading it, found that the game mode no longer functioned. After much research, fiddling, and assistance from my classmates and tutor, we discovered that the issue was due to the main camera and the third-person walker camera being set to different displays after loading the scene. From then on, I will always save my Unity work as a project file.
I completed this project quickly and efficiently despite the setbacks and enjoyed doing so. Experiencing childhood landmarks I hold dear in a completely different manner, and being able to share these intimate memories with others, made this project incredibly special. However, if I were to redo it, I would dedicate more time to finer details and the realism of the world, as I feel it would be more immersive that way.

Our final artefact was creating a virtual world avatar in Spark AR. Starting this project behind everyone else due to illness increased pressure, but after browsing through Lenslist for filter inspiration, all those concerns seemed to dissipate. They were instead replaced with an excitement to create something unique and beautiful.
A holographic/iridescent face was the inception of my filter-to-be; it portrayed ethereality and otherworldly beauty. I wanted to build upon the holographic face with a mermaid-esque crown to further emanate this impression.
I encountered slight hiccups while constructing my filter, the most prevalent of which was not knowing how to create the holographic face appearance. I engaged my tutor's support on this, and he suggested I use a variety of different-coloured lights to create the effect. Thankfully, he was entirely correct; I used this method to create a multicoloured, iridescent facial effect that matched the user's head movements. He also advised me on how to fit my crown asset perfectly onto the user's head, as I struggled with this. He suggested I cut the back half off the crown model in Blender, giving the illusion of the crown fitting around one's head.
Because of his assistance, I was able to create my filter exactly as intended, exuding a surrealistic, mystical beauty. Additionally, I now have a wealth of knowledge about producing certain effects for a filter in Spark AR. I plan to practice these newfound skills and create more filters in Spark AR, as I felt such a sense of accomplishment when my filter was functional and published.


Final Artefacts Presentation

Here is the link to the presentation of my final artefacts, and the process of creating each of them:

https://docs.google.com/presentation/d/1MJa0tsLjw1tOQHVSbUZovtfgZiRhpL2b7UMAX35iztM/edit#slide=id.p


Week 10: 360 Film Finalization

I began editing the 360 film scenes in Eevo this weekend but was unsure how to connect the various scenes together to create the interactive element. I couldn't find any instructions or guides online, so I emailed my tutor to ask. He provided me with various guidance links from the Eevo website and also informed me that text cannot be added to videos directly in Eevo; it must be added in After Effects before the clips are placed into Eevo. This proved to be a problem, as my partner and I had assumed we could place large bodies of text on various scenes to provide the player with information on the world, a crucial aspect of our narrative. Additionally, I don't have After Effects on my laptop, and waiting until Monday's lecture seemed too tight, as the deadline for the final film is on Wednesday. As such, I decided to create a Google slideshow to add and animate the text. I then screen-recorded this in QuickTime Player and cut the clips, placing them in Eevo afterwards to ensure our choice-option scenes would be complete. Despite this, some final scenes still need further editing in After Effects during Monday's lecture, as the informative text is necessary to provide the player with context, and it can't be added any other way.

Editing on Eevo
The text slides I screen-recorded

Here is the link for the slideshow: https://docs.google.com/presentation/d/1qhULiTYxNXaurGzyoRVUjVEuCl5ejNUpFAqfeEjbmOs/edit#slide=id.p

We might be a little tight for time, especially with the other projects that need to be done by Wednesday as well, but I’m confident I will get this done. The process has not gone as smoothly as planned, but things like this inevitably happen when working on projects, and I’m proud of my ability to recognize an innovative solution to the problem at hand and fix it, rather than stress and worry.

My partner and I will add the text to the filmed 360 scenes on Monday, and then all that needs to be done is to connect the interactive elements on Eevo with the newly edited scenes. In the meantime, I will work to connect the rest of the scenes and ensure the interactive element is functioning.


Week 9: 360 Film Production

My partner and I began editing the 360 videos this week. We had previously agreed that she would take care of the technical aspects of the editing in After Effects, and I would lead the creative and narrative direction of the project in Eevo. We decided to do it this way to optimize our skills and resources and ensure the project was done effectively and on time. She has more experience with editing software, and in the same way, I have more experience in creative and narrative direction.

It may have been more effective for our learning, and for growing our skillsets, to take charge of the areas we were less familiar with; however, with the time constraints placed upon us, we felt it was best to proceed in the manner we knew would be most efficient.

We both sat together as my partner worked on the editing, and I provided suggestions on the kinds of edits we could make to each scene. Our initial ideas to edit in holograms and neon lighting were admittedly a bit far-fetched considering neither of us was familiar with After Effects, but we were able to add bright blue lighting in place of the classic white or orange lighting on campus, as well as text on some clips. We decided to focus more on the narrative aspect of the film, as trying to edit in realistic effects proved to be quite difficult.

Meanwhile, I sketched out a storyboard for how each scene would connect to the next, and therefore the kind of plot this would create. We had previously decided to operate the interactive aspect of the plot in such a way that no matter what the player chose, they would receive the same information, despite arriving at different scenes. This way, the intention of the plot is achieved – the player receives all the information – while they remain under the illusion that their choices change the story's outcome. This allows the film to remain immersive and interactive while still delivering our intended plot effectively.

My sketched out storyboard/film plan
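This branch-and-converge structure can be written down as a tiny scene graph. Below is a rough, illustrative sketch in Python – the scene names and information labels are invented for the example, not the ones from our actual film:

```python
# Toy sketch of a branch-and-converge interactive narrative: whichever
# scene the player picks, every path still ends with the same
# information. Scene names and labels here are purely illustrative.

SCENES = {
    "intro":    {"info": set(),              "choices": ["corridor", "canteen"]},
    "corridor": {"info": {"ship_history"},   "choices": ["bridge"]},
    "canteen":  {"info": {"ship_history"},   "choices": ["bridge"]},
    "bridge":   {"info": {"mission"},        "choices": []},
}

def collect_info(scene, gathered=frozenset()):
    """Return every distinct information set a player can finish with."""
    gathered = gathered | SCENES[scene]["info"]
    choices = SCENES[scene]["choices"]
    if not choices:                      # ending scene
        return {gathered}
    endings = set()
    for nxt in choices:                  # explore every branch
        endings |= collect_info(nxt, gathered)
    return endings

endings = collect_info("intro")
# Only one distinct ending exists, so every route is informationally equal.
assert len(endings) == 1
```

Running `collect_info` from any starting scene confirms that the branching is only an illusion of choice: all routes converge on the same information set, which is exactly the property we wanted the film to have.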

My partner finished the editing in After Effects, and now it is up to me to create the interactive narrative element of the film and ensure it flows smoothly. I have never used Eevo before, so I'm a little nervous, but given the tight deadline, I'm determined to get this done to a good standard by any means necessary.


AR Artefact Biography

I wanted to create a virtual world avatar that emanated ethereal, mystical beauty. As such, there is an iridescent, fantasy-like glow to the face, with a crown that matches this iridescence as the user's face moves. As the light catches the user's face, the colours of the holographic glow merge into one another, creating a glossy, prismatic effect. I believe this look exudes a paradisiacal feel; a delicate beauty that is somewhat otherworldly and yet, because human features aren't distorted in the filter, remains natural. I wanted to keep these mundane characteristics but warp them slightly, creating a surrealistic version of myself as my virtual world avatar. I felt this would elevate the user's experience with the filter, portraying an alternate, distorted version of humans, or perhaps an insight into what an otherworldly being's existence may look like.

Changing certain aspects of myself was always my intention with the filter, which may stem from modern humans' innate desire to alter their physical attributes to an extent, likely a consequence of unrealistic societal beauty standards. Harnessing these insecurities and creating a magical, transcendental virtual world avatar version of myself perhaps subconsciously freed me of this burden. I fabricated a heightened version of myself that is both ordinary and extraordinary; temporal and fey.

The inspiration behind the holographic quality of the filter was to give my virtual world avatar a mermaid-like quality, as if I were a mermaid princess (hence the crown). The crown is large in scale and abstractly shaped, reinforcing the fantasy element of the filter – an unnatural design that wouldn't exist in actuality. Combining the unorthodox crown and holographic lighting, I feel I have successfully illustrated a mystical version of myself as my virtual world avatar.


Week 9: Virtual Singapore Completed

This week, I added the finishing touches to my virtual tour of Singapore in Unity. I had completed most of it last week but still needed to add little details such as finishing the roads and sidewalks, as well as placing all the lampposts and bushes. Though this didn't take much time to complete, I suddenly had a problem I had never encountered before – my whole scene wouldn't appear in game mode, despite the first-person prefab and camera being placed correctly. As such, I sought help from the tutor on how to fix this issue. Luckily, after much research, trial and error, and even attempts at help from my fellow classmates, we managed to fix the problem. I had previously saved my work as a scene rather than saving the entire project itself, and as a consequence, when loading it up in Unity, the camera displays glitched. The camera display for my first-person walker was on Display 1, whereas the main camera was on Display 2, meaning they were attempting to show two different things at the same time. Changing both camera displays to Display 2 thankfully solved the problem, but I was definitely panicking for a second there!

After I had fully completed my world, I made sure to save the project file as a whole, rather than just a scene file. I will also do so in every future session where I use Unity, to prevent this issue from occurring again.

I’m very proud of myself for completing this project so quickly and with minimal issues. Despite it being tedious at times, with various little editing nuances, I generally really enjoyed using Unity and the freedom it allowed me to display my hometown in a unique way for others to experience. I’m really looking forward to seeing this mini version of Singapore that I’ve built in VR!

Completed virtual Singapore overview
Close-up of completed streets/roads and lampposts
Close-up of completed streets/roads and lampposts

Week 9: Creating My Filter

This week I worked on bringing my vision for my Spark AR filter to life. Still new to the software, I had to ask my tutor for advice on certain aspects I struggled with. For example, I wanted my filter to have a crown tracking the user's head, but found that it was levitating in front of my head instead of wrapping around it. I asked my tutor how to rectify this issue, and he suggested that I cut the back portion of the crown off to give the illusion of it wrapping around the back of the user's head. I then used Blender to do so, importing the new model into Spark AR. As expected, this fixed the issue and made the crown look as though it was perfectly wrapped around my head.
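Numerically, cutting the back half off a model amounts to culling every vertex that sits behind the head plane, which is what the deletion in Blender achieves. A rough Python sketch of the idea – the vertex coordinates below are invented for illustration, not taken from the actual crown model:

```python
# Conceptual sketch of the crown fix: delete all geometry behind the
# head plane (here, z < 0) so the remaining front half appears to
# wrap around the head. Vertices are (x, y, z); values are made up.

crown_vertices = [
    (0.0, 1.0, 0.8),    # front spike
    (0.5, 1.0, 0.3),
    (-0.5, 1.0, 0.3),
    (0.5, 1.0, -0.4),   # behind the head plane
    (-0.5, 1.0, -0.4),  # behind the head plane
    (0.0, 1.0, -0.8),   # back spike, also removed
]

def keep_front_half(vertices, plane_z=0.0):
    """Cull every vertex that sits behind the z = plane_z plane."""
    return [v for v in vertices if v[2] >= plane_z]

front = keep_front_half(crown_vertices)
print(len(front))  # 3 vertices survive; the back half is gone
```

In Blender itself I simply selected and deleted the back faces by hand, but this is the geometric effect of that operation.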

For the crown to look realistic (to an extent), and as eye-catching as I wanted it to be, I increased its metallic value to the maximum and used a smooth crown texture downloaded from Sketchfab. The model of the crown itself was also downloaded from Sketchfab.

The next step was to create the holographic face effect. I wanted the user's face to glow in a colourful, ethereal way. To do so, I intended to combine pink, green, blue and white lighting effects on the face. I had downloaded textures in these colours but was unsure how to place them on the face in the manner I wanted. Once again, I sought help from my tutor, and he suggested I use the various lighting effects in Spark AR and then colour them to achieve the look I wanted. As such, I pointed four different lights at the face, colouring them green, blue, pink and white, respectively. I then adjusted the strength of the lighting and the pigmentation of the colours to ensure the holographic effect I desired was achieved.
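Spark AR blends the lights itself, but the reason four tinted lights read as one iridescent glow is simply that light contributions add. A conceptual Python sketch of that additive mixing – the colour and intensity values below are invented, not my actual filter settings:

```python
# Why four tinted lights read as one holographic glow: their RGB
# contributions sum into a single pastel, multi-hued tint. Values
# below are illustrative, not the real filter settings.

def mix_lights(lights):
    """Additively combine (r, g, b, intensity) lights, clamped to 1.0."""
    r = g = b = 0.0
    for (lr, lg, lb, intensity) in lights:
        r += lr * intensity
        g += lg * intensity
        b += lb * intensity
    return tuple(min(c, 1.0) for c in (r, g, b))

face_lights = [
    (0.0, 1.0, 0.0, 0.3),  # green
    (0.0, 0.0, 1.0, 0.3),  # blue
    (1.0, 0.4, 0.7, 0.3),  # pink
    (1.0, 1.0, 1.0, 0.2),  # white, softer fill
]

print(mix_lights(face_lights))  # roughly (0.5, 0.62, 0.71): a pale tint
```

Tuning the intensity of each light is what I was doing when adjusting "the strength of the lighting and the pigmentation of the colours" in Spark AR.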

When browsing for ideas for my Spark AR filter, I initially wanted to combine this holographic face effect with a crown, a bleached-brow look and some colourful undereye eyeshadow. I decided against the more beauty-focused aspects when actually creating the filter, as I felt it would look too convoluted and messy. The final filter is much simpler than I originally intended, but because the crown is quite large and draws a lot of focus, I feel the holographic face, together with the way the colours change as your face turns, ties the look together quite nicely. It also has a wet, ethereal vibe – the best way I can describe it is that, when using the filter, I feel like an underwater mermaid princess. Though it may seem silly, this sort of mystical beauty was my original intention for the filter, and if I had done more, I fear its impact and the energy it conveys would have been completely different.

Happy with the quality and look of the filter, I have just published it to be used on Instagram, Facebook and Messenger – “holographic crown”. I’m excited to share this with my friends and for them to use the filter themselves!

Being someone who has used various filters on Snapchat and Instagram from time to time, it’s felt very surreal to publish one of my own for people all over the world to use, if they choose to do so. I think I may enjoy making and publishing more filters in my free time!

Demo clip for my filter

Week 8: UV Texturing

This week we looked at how to texture models on Maya. I had missed the previous class on colouring models as I was sick, and so there were a few steps that I had to learn before texturing.

I didn’t struggle as much as I thought I would, especially given that I had missed the previous lesson, but there were still some instances where I needed help. Generally, however, I feel that I am now able to texture and colour basic shapes by myself, which will come in handy when starting to do so on my fantasy world models. I am a little worried about this, though, because I will be modelling trees and obscurely shaped homes for my fantasy world, which I fear will be quite challenging in and of itself; taking these models apart to colour and texture them effectively will certainly pose a further obstacle. Luckily, during classes, I can ask for help from my tutor whenever I need it, as well as from classmates who are more experienced in Maya.

What I found most challenging in this process was puzzling out which grid (once separated from the original shape) was which side of the shape. We took the shapes apart into smaller, 2D faces, and while this is the most effective way to colour and texture, it sometimes confused me, as I had trouble picturing how the grids fit back into the whole shape in my mind. Perhaps I'm not explaining this helpfully, but it felt a little as though I was doing geometry. Math has never been a strong suit of mine, so I did find myself getting a little frustrated at times. However, I know that with more practice this process will become much easier, and as I improve, I can move on to more complex shapes to prepare for colouring and texturing my fantasy world models.
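The confusing part – remembering which flattened 2D grid belongs to which 3D face – can be written down as a simple lookup table. A toy Python sketch of a cube unwrapped into a cross-shaped UV layout; the tile coordinates are illustrative, not what Maya actually produces:

```python
# A cube unwrapped into a cross-shaped layout: each 3D face becomes a
# 2D grid at a (column, row) tile, and keeping track of which grid is
# which face is exactly the geometry puzzle. Coordinates are made up.

uv_cross_layout = {
    "top":    (1, 0),
    "front":  (1, 1),
    "bottom": (1, 2),
    "back":   (1, 3),
    "left":   (0, 1),
    "right":  (2, 1),
}

def face_at(column, row):
    """Look up which cube face a given UV tile belongs to."""
    for face, tile in uv_cross_layout.items():
        if tile == (column, row):
            return face
    return None

print(face_at(2, 1))  # the tile beside "front" maps back to the "right" face
```

Writing the mapping out like this is, in effect, what the UV editor shows: a dictionary between flattened grids and the faces of the original shape.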

We first learned to colour and texture a basic shape – a cube – and then moved on to a slightly more complex shape, a treasure chest. Below are photos of this process in class.

Colouring the cube
Colouring the chest (incomplete as I ran out of time)

Week 8: Spark AR

Today I was introduced to Spark AR, the project we were told about last week that I missed because I was sick. We are meant to create a filter as a virtual avatar of ourselves, and then write a 300-word explanation of why we decided to depict ourselves in such a way.

I learnt some of the basics during class, creating a basic tester filter of glasses and a frame. My tutor also shared a website with me where I could browse for inspiration on the kinds of filters I could create. Through browsing, I’ve found that the holographic filters resonate with me the most, and I think I would like to recreate this look for my virtual avatar filter. It emanates a sort of ethereal surrealism, where the natural human features are still intact and yet somehow warped and different. The various colours used for the holographic elements also make it seem distorted, perhaps because they differ from the colours that appear naturally on people’s faces.

My eye was also drawn to a beauty filter, where there was a form of undereye eyeshadow, which is abstract, and a bleached brow. Bleached eyebrows are trendy right now, and though I wouldn’t take part in it myself, I have wondered occasionally how I might look with bleached brows. I saw this as an opportunity and thought I could make a holographic face filter, with bleached brows and undereye/cheek colouring. I thought it could perhaps be an insight into what beauty standards may be in the future, such as in a distant, fully-digital era, or a portrayal of some alternate, distorted version of humans.

Holographic/futuristic beauty filter moodboard
The glasses and screen filter I made as a trial

Week 8: Editing in After Effects

Today my partner and I began editing our 360 film clips in Adobe After Effects. As it had been a while since our initial tutorial on the software, we had both forgotten most of the basics, so we did some research and asked our tutor for assistance to get started. We both soon realized that we had been overly ambitious with how we wanted to edit our clips, as After Effects can be quite finicky, and thus unfortunately had to compromise our vision to suit our skillset.

We began with a random clip – one from the canteen – as all the clips will have to be imported individually into Eevo (the website we’re using to edit in the interactive element of the film); it doesn’t really matter which particular order we edit them in.

To create the effect of a futuristic world in which this film takes place, we decided to edit the lighting within the scene. We thought bright, almost neon blue lighting would be fitting for this spaceship canteen, instead of the regular white/orange lighting, and so placed various gradient ellipses on the lights in the clip.

Due to our initial struggle with using After Effects, this was unfortunately all we were able to achieve today, but we have set up plans for editing each scene, including the order in which they will appear in the final film. We intend to continue working on this tomorrow, ideally finishing all editing in After Effects, so we can begin editing the interactive element next week. This gives us wiggle room in case there are any last-minute tweaks to be done before the final due date.

Photo of our edited scene with the futuristic blue lighting