Categories
Collaborative Unit MA Visual Effects Term Two

RESEARCHER COLLAB

For this part of the course, we had to collaborate with the neighboring course, Computer Animation. I went to the Padlet where everyone was posting their project ideas and saw one called Researcher Collab. It was sci-fi, so it immediately got my attention.

I liked the idea behind it, and they needed VFX people for some cool effects and lighting, so I wanted to try to work with them.

Here is the animatic of the project.

We created a Discord channel for better communication, and we usually had a meeting every week to see how we were doing. First, we discussed the storyboard that had already been created and whether it needed any changes.

We also created a OneDrive folder for easy file sharing. Once we had organized ourselves, we started working.

My part in this project was to create some VFX and also set up the lighting of the scenes.

For the VFX part I decided to work in Houdini, since we are learning it at the moment and I could have the support of the teacher; I also really like the program and wanted to learn more.

This is the general overview of the node editor in Houdini.

The general workflow is to import the object, scatter points on it, and then project those points onto the floor.

These projected points are the ones that will generate the smoke/dust simulation.

This is the first version and it is only smoke, but for previz it is more than fine; the idea is to have both smoke and dust. We learned how to do that in weeks 6 and 7 of the Houdini class.

After that we create the density and generate the smoke with the Pyro Solver, adjusting the right attributes to get the simulation I want.

In the end, I convert it to VDB so I can import it into Maya, where we are going to render the scene.
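As a rough sketch, this is how that last export step could be scripted with Houdini's Python API. The container and node names (dust_fx, pyro_import) and the output path are placeholders, and it assumes a recent Houdini version:

```python
import hou

# Assumes a geometry network holding the pyro fields,
# e.g. /obj/dust_fx/pyro_import (names are hypothetical).
geo = hou.node("/obj/dust_fx")
pyro = geo.node("pyro_import")

# Convert the native Houdini volumes to VDB grids.
to_vdb = geo.createNode("convertvdb", "to_vdb")
to_vdb.setInput(0, pyro)

# Write out a .vdb sequence that Maya/Arnold can then read.
rop = geo.createNode("rop_geometry", "export_vdb")
rop.setInput(0, to_vdb)
rop.parm("sopoutput").set("$HIP/vdb/dust_sim.$F4.vdb")
rop.parm("trange").set(1)          # write the whole frame range, not just one frame
rop.parm("execute").pressButton()  # writes the sequence to disk
```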

Here we can see a bit more of the process:

Importing the object and unpacking it.

After that we delete some of the faces that we don’t want or need to project the points from.

After that we scatter to create points and add a VOP attribute that we will use to control the expansion of the points:

Here we can see how the VOP attribute works: the bigger the attribute, the closer the points stay together, and when we reduce the value we can see the points spreading.

After this we create density and a distance attribute, so that the farther the ship is from the floor, the fewer points are projected.
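To make those steps more concrete, here is a minimal sketch of how this part of the network could be built with Houdini's Python API. Node names like ship_dust and GROUND, the group name, the file path, and the falloff distance are all made up for the example, and some parameter names can differ between Houdini versions:

```python
import hou

obj = hou.node("/obj")
geo = obj.createNode("geo", "ship_dust")   # hypothetical container name

# 1. Import the ship and unpack it so we can work on its polygons.
ship = geo.createNode("file", "import_ship")
ship.parm("file").set("$HIP/geo/ship.bgeo.sc")   # placeholder path
unpack = geo.createNode("unpack")
unpack.setInput(0, ship)

# 2. Keep only the faces we want to project points from
#    (a primitive group called "underside" is assumed to exist).
keep = geo.createNode("blast", "keep_underside")
keep.setInput(0, unpack)
keep.parm("group").set("underside")
keep.parm("negate").set(1)   # delete everything except the group

# 3. Scatter points on the remaining faces.
scatter = geo.createNode("scatter")
scatter.setInput(0, keep)
scatter.parm("npts").set(5000)

# 4. Per-point mask (stand-in for the VOP attribute) and the height
#    of each point above the floor before projection.
mask = geo.createNode("attribwrangle", "mask_and_height")
mask.setInput(0, scatter)
mask.parm("snippet").set(
    "f@mask = rand(@ptnum);   // random per-point weight\n"
    "f@height = @P.y;         // remember height above the floor"
)

# 5. Project the points onto the ground plane (in the Ray SOP
#    parameters, set the direction to shoot straight down).
ground = geo.createNode("grid", "GROUND")
ray = geo.createNode("ray", "project_to_floor")
ray.setInput(0, mask)
ray.setInput(1, ground)

# 6. Density falls off the higher the ship is above the floor.
density = geo.createNode("attribwrangle", "density_falloff")
density.setInput(0, ray)
density.parm("snippet").set(
    "f@density = f@mask * fit(f@height, 0.0, 5.0, 1.0, 0.0);"
)

geo.layoutChildren()
```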

After this it is just a matter of tweaking the settings on the Pyro Solver to create the desired simulation. And this is the result I got:

After this, the process is to export it to Maya and work from there on the shaders for the render. This is a rough render of the result in Maya. The shading is not the best; I was having issues with it where it only rendered black.

Once we finished the work for previz, it was time to render. We had some issues with the render farm at the university: all the shots that had VFX in them kept crashing. We reached the conclusion that the problem was that we used Maya 2020 for the VFX, and it is more buggy than 2018, the version on the computers at uni.

So we rendered out the first act without any of the VFX; here you can see the result:

After the problems we had with the render farm we thought we would not be able to render the shots with the VFX, but our team worked really hard to solve it and we ended up with good renders to show. With the VFX and all the sound design added, the result is this:

As you can see, there are some parts that are not rendered in Maya; those are Act 02 and Act 03, which we did not have time to finish. But we knew it was a long project, so we focused only on the first act to do it properly. We didn’t want to rush it and end up with a bad result.

Categories
Advance and Experimental VFX Animation and Techniques MA Visual Effects Mehdi Workshop Term Two

HOUDINI – WEEK 06

Week six started with a small practice to make something cool: from a test geometry we scatter points to create an effect where the geo seems to be vanishing.

With the VOP attribute we can randomize which points are affected by the simulation, and it is also used to make the geometry disappear when it gets very small, so the particles we can no longer see stop simulating.

Then, as always, we create the dopnet for the solver, adding some wind resistance and modifying the direction of the wind. In this case, we make the particles go up, against gravity.
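As an illustration of those ideas (a random per-particle weight, killing particles once they are too small to see, and a wind force pushing upward), here is a small sketch of the kind of VEX you could put on a POP Wrangle inside the dopnet. The container name, attribute names and numbers are just examples, and the wrangle still has to be wired into the particle solver by hand:

```python
import hou

geo = hou.node("/obj/vanish_fx")              # hypothetical geometry container
popnet = geo.createNode("popnet", "vanish_sim")

# POP Wrangle with the per-particle logic; wire it between the existing
# source and output nodes inside the popnet.
wrangle = popnet.createNode("popwrangle", "forces_and_cleanup")
wrangle.parm("snippet").set(
    "f@mask = rand(@id);                        // random weight per particle\n"
    "f@pscale *= 0.97;                          // shrink a little every step\n"
    "if (f@pscale < 0.005) i@dead = 1;          // stop simulating invisible points\n"
    "v@force += set(0.0, 2.0, 0.0) * f@mask;    // wind pushing up, against gravity\n"
)
```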

Here is the final simulation:

After this small simulation, we learned how to make geometry follow the path of a smoke simulation. We create a smoke simulation as we have done on different occasions before. This time, we create a null at the end of the smoke sim called VOL ADVECT that we will use to make the geometry follow the same sim.

After this, we create a dopnet for the particles that will follow the sim and connect it to the VOL ADVECT with the popadvectbyvolumes node, which links it to the previous volume simulation.

After this, we just have to create a sphere, replace the particles with spheres using copytopoints, and then merge it with the previous sim, and we have the spheres following the smoke sim.
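A rough outline of that setup, scripted with Houdini's Python API. The container and node names are assumptions, and the POP Advect by Volumes node still has to be pointed at the VOL ADVECT null in its parameters and wired into the particle solver:

```python
import hou

geo = hou.node("/obj/smoke_follow")              # hypothetical container

# The smoke sim ends in a null exposing the velocity volumes.
vol_advect = geo.createNode("null", "VOL_ADVECT")
# vol_advect.setInput(0, <last node of the smoke sim>)

# Particle network whose particles get pushed around by those volumes.
popnet = geo.createNode("popnet", "advected_particles")
advect = popnet.createNode("popadvectbyvolumes")
# In its parameters, set the velocity source to the VOL_ADVECT null,
# then wire the node into the particle solver chain.

# Replace each particle with a small sphere and merge with the smoke.
sphere = geo.createNode("sphere")
copy = geo.createNode("copytopoints")
copy.setInput(0, sphere)    # geometry to copy
copy.setInput(1, popnet)    # points from the particle sim
merge = geo.createNode("merge")
merge.setInput(0, copy)
merge.setInput(1, vol_advect)
merge.setDisplayFlag(True)

geo.layoutChildren()
```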

Categories
MA Visual Effects Term Two

SHOWREEL | TERM TWO

SHOWREEL

INDIVIDUAL CLIPS

3D PROJECTION – CLEAN PLATE

3D PROJECTION – 3D MODEL INTEGRATION

BEAUTY PRACTICE

MARKERS REMOVAL

GREEN SCREEN 01

GREEN SCREEN 02

Categories
Collaborative Unit MA Visual Effects Term Two

MARKERS REMOVAL

This practice is part of the smart vector work that we also used for the beauty practice. In this case we had to remove the markers. This could be achieved by tracking each one of the markers normally, but that would require more time and work. With this technique we only have to generate the smart vectors, which work as a track, and with them we can make one patch, or as many patches as necessary, for all the markers together.

First I denoised the plate and created the smart vectors.

Then I created the necessary patches throughout the video; I created more than one because the light was changing a lot.

Then I used the smart vectors to track the clip and keep the patches in place at all times.

After that, also using the vectors, I was able to carry over some of the motion blur that the movement of the clip should create.

After that, I premultiplied each patch and merged it with the original video, adding back some of the grain we lost with the denoise.
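Scripted, the core of that tree could look roughly like this. It is only a sketch using Nuke's Python API: it assumes NukeX for SmartVector/VectorDistort, an already denoised plate called denoised_plate, a Read of the baked vectors called smartvectors_read, and input orders and knob names that can vary slightly between Nuke versions:

```python
import nuke

plate = nuke.toNode("denoised_plate")            # hypothetical denoised plate

# 1. Generate the smart vectors once and render them to disk.
smart_vec = nuke.nodes.SmartVector(inputs=[plate])
# render its output to an .exr sequence and read it back in

# 2. A patch painted on a single reference frame...
patch = nuke.nodes.RotoPaint(inputs=[plate])

# 3. ...is pushed through the shot by the vectors so it sticks to the plate.
vectors = nuke.toNode("smartvectors_read")       # assumed Read of the baked vectors
distort = nuke.nodes.VectorDistort()
distort.setInput(0, patch)    # image to distort
distort.setInput(1, vectors)  # smart vectors (check input labels in your Nuke version)
# set the node's reference frame to the frame the patch was painted on

# 4. Premult the patch and merge it back over the original plate.
premult = nuke.nodes.Premult(inputs=[distort])
comp = nuke.nodes.Merge2(inputs=[plate, premult])
comp["operation"].setValue("over")
# finally, add grain back on top to match what the denoise removed
```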

And this is the result.

Categories
Collaborative Unit MA Visual Effects Term Two

GREEN SCREEN EXTRACTION | ALPHA TEST 02

Another green screen practice. This one has a better green screen, but it is also a bit more difficult because of the fur and the white contrast.

First I denoised the clip as always and then decided to use IBK techniques to see how they work.

This is the alpha I got.

Then came the transparency and luminosity work for the integration with the clip/image I wanted to add as the background.

And here we can see it works just fine; it looks promising for now.

But when I added the background I found that the alpha had some issues: some parts were not completely white. That is because I used only one treatment for the whole alpha, while the left part of the green screen has some shadows that gave the alpha transparency issues.

Here we can see the issue.

When that happened, I had to combine different alphas for different parts of the body. I made an alpha for the right side of the body, where the green screen is brighter, and another matte for the left side, because it was darker.

I also did a separate alpha for the ears, because they have a harder edge compared with the furry body.

I combined the different alphas into one, and also did some luminosity treatment.
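A simplified sketch of that key-combination idea in Nuke's Python API. The IBK class names are as in recent Nuke versions, but the node names, the roto shapes separating the bright and dark sides, and the exact input order of the IBK and Keymix nodes are assumptions to be checked against the DAG labels:

```python
import nuke

plate = nuke.toNode("denoised_plate")            # hypothetical denoised clip

# IBK pair: a clean screen colour feeding the gizmo that pulls the key.
ibk_col = nuke.nodes.IBKColourV3(inputs=[plate])

def ibk_key(name):
    # Helper: one IBKGizmo per region, each tuned differently.
    key = nuke.nodes.IBKGizmoV3(name=name)
    key.setInput(0, plate)    # fg plate (check fg / c input labels)
    key.setInput(1, ibk_col)  # clean screen colour
    return key

key_bright = ibk_key("key_bright_side")   # for the well-lit right side
key_dark = ibk_key("key_dark_side")       # for the shadowed left side
key_ears = ibk_key("key_ears")            # harder-edged key for the ears

# Roto masks decide which key is used where, combined with Keymix.
side_mask = nuke.nodes.Roto()
combined = nuke.nodes.Keymix()
combined.setInput(0, key_bright)   # base matte
combined.setInput(1, key_dark)     # matte used inside the mask
combined.setInput(2, side_mask)    # mask isolating the darker side

ear_mask = nuke.nodes.Roto()
final_alpha = nuke.nodes.Keymix()
final_alpha.setInput(0, combined)
final_alpha.setInput(1, key_ears)
final_alpha.setInput(2, ear_mask)
```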

And this is the result of the integration with the background.

Categories
Collaborative Unit MA Visual Effects Term Two

GREEN SCREEN EXTRACTION | ALPHA TEST 01

Green screen extraction can be a long process, and there are different techniques that can be applied to get a good result. Here is the original plate I chose.

I decided to use the Keylight node for this practice. The first Keylight creates an interior alpha matte, strong and completely white, so we don't have any unwanted transparency. Then we make a core matte, more of a general matte, which we Keymix with a softer matte for the hair and the more delicate parts.

For the Keylight we sample the green of the screen that we want to remove and adjust it a bit to get a better result.
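As a rough Python sketch of that Keylight tree: the plate name, the sampled screen colour and the Keymix input order are placeholders, and knob names such as screenColour may differ slightly between Nuke versions:

```python
import nuke

plate = nuke.toNode("denoised_plate")    # hypothetical denoised plate

# Hard interior matte: pushed until it is solid white inside the subject.
core = nuke.nodes.Keylight(inputs=[plate])
core["screenColour"].setValue([0.1, 0.6, 0.1])   # sampled screen green (placeholder)

# Softer matte that keeps detail in the hair and delicate edges.
soft = nuke.nodes.Keylight(inputs=[plate])
soft["screenColour"].setValue([0.1, 0.6, 0.1])

# A roto shape limits the hard core matte to the inside of the subject,
# and Keymix blends the two alphas into the final matte.
inner = nuke.nodes.Roto()
matte = nuke.nodes.Keymix()
matte.setInput(0, soft)    # base, soft matte
matte.setInput(1, core)    # hard matte used inside the mask
matte.setInput(2, inner)   # roto mask of the subject's interior
```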

This particular clip was a bit tricky, since the green screen was not very good. It is important to have a well-lit, well-recorded clip for the green screen extraction to work at its best.

Here we see the resulting alpha, and the different alphas combined into one.

After that, we can integrate it with a different background. I also treated the clip so we don’t lose any luminosity and it integrates better.

This is the end result:

Categories
Collaborative Unit MA Visual Effects Term Two

CLEAN PLATES – 3D PROJECTION

In this practice we learn how to do a clean plate from a video. We start with a simple one that we will later use for a collaborative project.

First we track the plate in order to create cards to use for the patches.

Here we can see that the track works, so we can proceed to remove the few markers that we have on the wall and floor.

We proceed to create the patches:

The first time I tried, I had some issues with the markers at the far end of the plate. This is the result; I could not remove them.

I kept trying until I found the solution: I had the card in the wrong place, even though it looked like it was in the right position. After I placed the card in a better position the problem went away, and as we can see in the next video the markers are removed and we have a clean plate to work from.
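For reference, a stripped-down sketch of how a projection patch like this can be wired up with Nuke's Python API. It assumes a camera already solved from the track (here called tracked_camera); the node names, the held frame and the card position are placeholders, and some class names (Project3D, FrameHold, Camera) vary a little between Nuke versions:

```python
import nuke

plate = nuke.toNode("plate_read")          # hypothetical Read of the shot
camera = nuke.toNode("tracked_camera")     # camera solved from the track

# Clean patch painted on one held frame of the plate.
hold = nuke.nodes.FrameHold(inputs=[plate])
hold["first_frame"].setValue(1050)         # placeholder reference frame
patch = nuke.nodes.RotoPaint(inputs=[hold])

# Project the held patch through the camera onto a card placed on the wall.
project = nuke.nodes.Project3D(inputs=[patch, camera])
card = nuke.nodes.Card2(inputs=[project])
# The card's translate/rotate must match the real wall position --
# this is exactly the part that caused the issue with the far markers.

# Render the card through the moving camera and merge it over the plate.
render = nuke.nodes.ScanlineRender()
render.setInput(1, card)     # obj/scn input
render.setInput(2, camera)   # cam input
comp = nuke.nodes.Merge2(inputs=[plate, render])
comp["operation"].setValue("over")
```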

Categories
Advance and Experimental VFX Animation and Techniques MA Visual Effects Mehdi Workshop Term Two

HOUDINI – WEEK 05

This week is all about smoke and fire. We learned how to create a smoke simulation, like a fireplace.

We start from a circle and scatter it to create points, then we create the density attribute and visualize it.

After this we create a dopnet for the solver. In this case we use the Pyro Solver, which helps create a better simulation here. We add as many nodes as we need, like gas wind, turbulence, etc., to create the smoke movement we want.

Once we are happy with the simulation we can dopimport the solver, and with the Volume Visualization node we give the smoke emission from the temperature attribute, which creates the fire. We adjust it as needed, then we can file-cache the simulation and either do a flipbook or render it out.
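The "density on scattered points" part boils down to a tiny wrangle. Here is a hedged sketch of that source setup with Houdini's Python API; the container name and values are arbitrary, and the dopnet itself is built as described above:

```python
import hou

geo = hou.node("/obj/fireplace")            # hypothetical container
circle = geo.createNode("circle")
# set the circle's primitive type to Polygon so it has a face to scatter on
scatter = geo.createNode("scatter")
scatter.setInput(0, circle)

# Give the points the fields the pyro solver will source from:
# density drives the smoke, temperature drives the emission/fire look.
src = geo.createNode("attribwrangle", "pyro_source_attribs")
src.setInput(0, scatter)
src.parm("snippet").set(
    "f@density = 1.0;\n"
    "f@temperature = fit01(rand(@ptnum), 0.5, 1.0);"
)
src.setDisplayFlag(True)
```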

Here you can see what it would look like in the final render:

This is a flipbook of the simulation:

Categories
Advance and Experimental VFX Animation and Techniques MA Visual Effects Mehdi Workshop Term Two

HOUDINI – WEEK 04

Week 04 is dedicated to rendering with Arnold in Houdini. We used the simulation we made earlier of the hammer smashing the test geometry on the floor.

Arnold in Houdini works the same way it works with Maya; you just need to make sure that the materials you apply to the objects are Arnold materials.

For the particles, we create a material that changes color over time and becomes more transparent. This is the result of the render.
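One common way to drive a material like that is to write each particle's age into point attributes and let the shader read them. Below is a minimal sketch of that wrangle; the container name, the colours, and the idea of binding Cd/Alpha inside the Arnold material are assumptions about the setup, not the exact shader network used:

```python
import hou

geo = hou.node("/obj/hammer_sim")        # hypothetical container with the particles
fade = geo.createNode("attribwrangle", "age_to_colour")
# fade.setInput(0, <imported particle sim>)

# Normalised age 0..1 over each particle's life, remapped to colour and alpha.
fade.parm("snippet").set(
    "float t = f@age / max(f@life, 0.001);\n"
    "v@Cd = lerp({1.0, 0.6, 0.2}, {0.2, 0.2, 0.2}, t);  // warm to grey over time\n"
    "f@Alpha = 1.0 - t;                                  // fades out as it ages"
)
```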

Categories
Advance and Experimental VFX Animation and Techniques MA Visual Effects Mehdi Workshop Term Two

HOUDINI – WEEK 03

This week we focused on procedural modeling. The idea is to build a wood cabin and later destroy it with simulations.

We started with a box for the basic shape of the cabin. Then we deleted all the faces except the ones on the bottom and top and extruded them, and like this we created the floor and ceiling. At the end I merged it with a supporting base.

We do the same for the wood planks that will make up the walls of the cabin. We create a simple box that we transform and position, then with a copy node we copy it as many times as we need, and with a transform node we put it in place.

Then we create the windows and door, and with a boolean node we subtract them from the geo we already have to create the openings for the door and windows.

Then we merge it all together and create simple geometry to fill the holes created for the door and windows.
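A compressed sketch of the plank-copying and boolean part of that workflow in Houdini Python. The sizes, counts and names are placeholders, the copy node here is the Copy and Transform SOP, and parameter names may differ slightly by version:

```python
import hou

geo = hou.node("/obj/wood_cabin")            # hypothetical container

# One plank, sized and positioned by hand first.
plank = geo.createNode("box", "plank")
plank.parmTuple("size").set((4.0, 0.3, 0.3))

# Copy and Transform stacks the plank upward to build a wall.
wall = geo.createNode("copyxform", "stack_planks")
wall.setInput(0, plank)
wall.parm("ncy").set(10)       # number of copies
wall.parm("ty").set(0.3)       # each copy sits one plank higher

# A window box is then booleaned out of the wall.
window = geo.createNode("box", "window_cutter")
cut = geo.createNode("boolean", "cut_window")
cut.setInput(0, wall)
cut.setInput(1, window)
cut.parm("booleanop").set("subtract")

geo.layoutChildren()
```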

And with the rbdmaterialfracture node we prepare the geometry to be destroyed. We can see how it is going to look with an exploded view.

We create a group from the selection, called inactive, which we will use to keep the selected primitives from being destroyed by the simulation.

Before proceeding with the simulation, we separate the different materials so they fracture in different ways, since we have wood, concrete, and glass.
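And a hedged sketch of that fracture preparation: each material gets its own RBD Material Fracture with the matching preset, and an inactive group marks the pieces that should not move. The node and group names are just examples:

```python
import hou

geo = hou.node("/obj/wood_cabin")                 # same hypothetical container
cabin = geo.node("cabin_merged")                  # assumed merge of all the parts

# Fracture each material with its own settings (wood splinters, glass shatters...).
wood_frac = geo.createNode("rbdmaterialfracture", "fracture_wood")
wood_frac.setInput(0, cabin)

glass_frac = geo.createNode("rbdmaterialfracture", "fracture_glass")
# glass_frac.setInput(0, <the glass primitives only>)
# In practice each fracture node gets only the primitives of its material,
# and its Material Type parameter is set to Wood / Glass / Concrete.

# Pieces put in the "inactive" group are kept static in the simulation.
inactive = geo.createNode("groupcreate", "inactive")
inactive.setInput(0, wood_frac)
inactive.parm("groupname").set("inactive")
```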

And after that, this is the end result.