Categories
Advance and Experimental VFX Animation and Techniques Collaborative Projects MA Visual Effects Term Three

ARMAMENTUM – WEEK 01

This project is a collaboration with Gherardo: I had an idea and asked him if he wanted to work on it with me.

The main idea is to record ourselves in front of a green screen and replace our forearms with robotic/futuristic arms. The project brings together a lot of different techniques, which is exactly what we wanted: a short video where we can show our work across many disciplines. Some of them are green screen extraction, rotoscoping, modelling, texturing and lighting, all combined to produce a nice shot for our reels.

For this project, we asked Dom to help us, since he is the one who showed us how to use 3DE4 for the tracking process. He was happy to help and to oversee each step of the project.

The basic plan is:

  • STEP 1: CAMERA MOVEMENT
  • STEP 2: BACKGROUND/ENVIRONMENT
  • STEP 3: DIFFERENT IDEAS FROM EACH OF US FOR THE BACKGROUND
  • STEP 4: CHARACTER ACTING
  • STEP 5: ANIMATIC IN 3D

Once we are happy with every step, we can start shooting the scene.

So the first step was deciding the camera movement for the shot. For this, we created several camera setups in Maya, each with a different idea.

After looking at them with Dom, we selected the two cameras we liked the most and combined them. This is the result of that combination.

The first week of the project is quite simple and short, but it is the first step and as important as the rest: it is the base of the project. I'm sure that in a few weeks, when we decide which background to use and settle other parts of the project, this could still change a bit, but the basic idea of the project is here now, and it's a good start.

After each step, we have a meeting with Dom so he can supervise the work, give us the OK and share more ideas on how to proceed. As the plan stands, the next step is to work on the background style. We will see more in the next post, but the basic idea we have is a futuristic/cyberpunk scene.

Categories
Collaborative Unit MA Visual Effects Term Two

RESEARCHER COLLAB

For this part of the course, we had to collaborate with the neighbouring course, Computer Animation. I went to the Padlet where everyone was posting their project ideas and saw one called Researcher Collab. It was sci-fi, so it immediately got my attention.

I liked the idea behind it, and they needed VFX people for some cool effects and lighting, so I wanted to try working with them.

Here is the animatic of the project.

We created a Discord channel for better communication, and we usually had a meeting every week to see how we were doing. First, we discussed the storyboard that had already been created and whether it needed any changes.

We also created a OneDrive folder to share files easily. Once we had organized ourselves, we started working.

My part in this project is to create some of the VFX and also set up the lighting of the scenes.

For the VFX part I decided to work with Houdini, since we are learning it at the moment and I could count on the teacher's support; I also really like the program and wanted to learn more.

This is the general overview of the node editor in Houdini.

The general workflow is importing the object, scattering it to points and then projecting those points onto the floor.

These projected points are the ones that will generate the smoke/dust simulation.
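The scatter-and-project idea can be sketched outside Houdini as well; here is a minimal Python stand-in (the bounding box, point count and floor height are made up for illustration, not the actual scene values):

```python
import random

def scatter_points(bbox_min, bbox_max, count, seed=0):
    """Scatter random points inside an axis-aligned bounding box
    (a rough stand-in for Houdini's Scatter SOP)."""
    rng = random.Random(seed)
    return [tuple(rng.uniform(lo, hi) for lo, hi in zip(bbox_min, bbox_max))
            for _ in range(count)]

def project_to_floor(points, floor_y=0.0):
    """Drop every point straight down onto the floor plane,
    like casting rays along -Y."""
    return [(x, floor_y, z) for x, y, z in points]

# points scattered around a hypothetical ship, then projected to y=0
pts = scatter_points((-1.0, 0.5, -1.0), (1.0, 2.0, 1.0), count=100)
floor_pts = project_to_floor(pts)
```

The projected points are then the birth positions for the smoke/dust sources.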

This is the first version and it is only smoke, but for previz it is more than fine. The idea is to have both smoke and dust; we learned how to do that in weeks 6 and 7 of the Houdini class.

After that, we create the density and generate the smoke with the pyro solver, adjusting the right attributes to get the simulation I want.

In the end, I convert it to a VDB so I can import it into Maya, where we are going to render the scene.

Here we can see a bit more of the process:

Importing the object and unpacking it.

After that, we delete the faces that we don't want or need to project the points from.

After that, we scatter to create points and add a VOP attribute that we will use to control the spread of the points:

Here we can see how the VOP attribute works: the higher the attribute value, the closer together the points are, and when we reduce the value the points spread out.
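As a rough illustration of that behaviour, here is a small Python sketch where a single "tightness" value plays the role of the VOP attribute; the blend towards the centroid is an assumption for the sketch, not the exact VOP network:

```python
def spread_points(points, tightness):
    """Blend each point between its original position and the cloud centroid.
    tightness=1.0 pulls everything to the centre (points close together);
    tightness=0.0 leaves the points fully spread out."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    return [(x + (cx - x) * tightness,
             y + (cy - y) * tightness,
             z + (cz - z) * tightness) for x, y, z in points]
```

Animating that single value from high to low gives the expanding-points look described above.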

After this, we create density and a distance attribute so that the farther the ship is from the floor, the fewer points are projected.
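A minimal sketch of that distance falloff, assuming a simple linear relationship between the ship's height and the number of projected points (the real attribute ramp may differ):

```python
def projected_count(base_count, ship_height, max_height):
    """Scale the number of projected points by the ship's distance to the
    floor: at height 0 all points project, at max_height none do.
    Linear falloff is an assumption for illustration."""
    falloff = max(0.0, 1.0 - ship_height / max_height)
    return int(base_count * falloff)
```

So a ship hovering just above the floor kicks up a full dust source, while one flying high produces almost nothing.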

After this, it is just a matter of tweaking the settings on the pyro solver to create the desired simulation. This is the result I got:

After this, the process is to export it to Maya and work from there on the shaders for the render. This is a rough render of the result in Maya. The shading is not the best; I was having issues with it that made it render only black.

Once we finished the previz work, it was time to render. We had some issues with the render farm at the University: all the shots with VFX in them kept crashing. We reached the conclusion that the problem was that we used Maya 2020 for the VFX, which is buggier than Maya 2018, the version on the computers at Uni.

So we rendered out the first act without any of the VFX; here you can see the result:

After the problems with the render farm, we thought we would not be able to render the shots with the VFX, but our team worked really hard to solve it and we ended up with good renders to show. With the VFX and all the sound design, the result is this:

As you can see, some parts are not rendered in Maya; those are Act 02 and Act 03, which we did not have time to finish. We knew it was a long project, so we focused on doing the first act properly. We didn't want to rush it and end up with a bad result.

Categories
Advance and Experimental VFX Animation and Techniques MA Visual Effects Mehdi Workshop Term Two

HOUDINI – WEEK 06

Week six started with a small exercise to make something cool: from a test geometry, we scatter points to create an effect as if the geometry is vanishing.

With the VOP attribute we can randomize which points are affected by the simulation; it also makes the geometry disappear once it is very small, so the solver stops simulating particles we can no longer see.
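The same logic can be sketched in plain Python; the activation chance, shrink rate and kill threshold below are made-up values, not the actual VOP settings:

```python
import random

def update_particles(particles, shrink=0.9, kill_below=0.01, seed=0):
    """One step of the vanishing effect. Each particle is (scale, active):
    inactive particles have a random chance of being picked up by the sim,
    active ones shrink each step, and particles too small to see are
    deleted so no work is wasted on them."""
    rng = random.Random(seed)
    out = []
    for scale, active in particles:
        if not active and rng.random() < 0.2:  # random activation
            active = True
        if active:
            scale *= shrink                    # shrink affected points
        if scale >= kill_below:                # cull invisible particles
            out.append((scale, active))
    return out
```

Running this repeatedly makes the point cloud dissolve away piece by piece, which is the look the exercise was after.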

Then, as always, we create the DOP network for the solver, adding some wind resistance and also modifying the direction of the wind. In this case, we make the particles go up, against gravity.
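As a rough stand-in for those DOP forces, here is a single Euler integration step where an upward wind overpowers gravity and a drag term plays the role of wind resistance; all the numbers are illustrative, not the real solver settings:

```python
GRAVITY = -9.81  # m/s^2, pulling down

def step(pos_y, vel_y, wind_y=15.0, drag=0.5, dt=1 / 24):
    """One Euler step: gravity plus an upward wind force, with air
    resistance proportional to velocity. Because wind_y > |GRAVITY|,
    the particles drift upward, against gravity."""
    accel = GRAVITY + wind_y - drag * vel_y
    vel_y += accel * dt
    pos_y += vel_y * dt
    return pos_y, vel_y
```

From rest, the net acceleration is positive, so the particle immediately starts rising, which is the behaviour set up in the DOP network.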

Here is the final simulation:

After this small simulation, we learned how to make geometry follow the path of a smoke simulation. We create a smoke simulation as we have done on different occasions, but this time we add a null at the end of the smoke sim, called VOL ADVECT, that we will use to make the geometry follow the same sim.

After this, we create a DOP network for the particles that will follow the sim and connect it to VOL ADVECT with a popadvectbyvolumes node. This links it to the previous volume simulation.

After this, we just have to create a sphere and replace the particles with spheres using copytopoints, then merge it with the previous sim, and we have the spheres following the smoke sim.
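At its core, what POP Advect by Volumes does is simple: sample the smoke's velocity field at each particle's position and move the particle along it. A minimal Python sketch of that, with a toy constant-updraft field standing in for the real vel volume:

```python
def advect(point, velocity_field, dt=1 / 24):
    """Move a particle one step through a velocity field sampled at its
    position -- the core idea behind advecting particles by volumes."""
    vx, vy, vz = velocity_field(point)
    x, y, z = point
    return (x + vx * dt, y + vy * dt, z + vz * dt)

# toy field: a constant 1 m/s updraft, like rising smoke
field = lambda p: (0.0, 1.0, 0.0)

p = (0.0, 0.0, 0.0)
for _ in range(24):  # one second at 24 fps
    p = advect(p, field)
```

With the real smoke sim, the field varies in space and time, so the copied spheres curl and billow along with the smoke.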

Categories
MA Visual Effects Term Two

SHOWREEL | TERM TWO

SHOWREEL

INDIVIDUAL CLIPS

3D PROJECTION – CLEAN PLATE

3D PROJECTION – 3D MODEL INTEGRATION

BEAUTY PRACTICE

MARKERS REMOVAL

GREEN SCREEN 01

GREEN SCREEN 02

Categories
Collaborative Unit MA Visual Effects Term Two

MARKERS REMOVAL

This practice is part of the smart vectors exercise that we also used for the beauty practice. In this case we had to remove the markers. This could be done by tracking each one of the markers normally, but that would require more time and work. With this technique we only have to generate the smart vectors, which work as a track, and with them we can make a single patch (or as many patches as needed) for all the markers together.

First I denoised the plate and generated the smart vectors.

Then I created the necessary patches throughout the video; I made more than one because the light was changing a lot.

Then I used the smart vectors to track the clip and keep the patch in place at all times.
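Conceptually, the smart vectors give a per-frame motion vector at the patch position, and accumulating those vectors keeps the patch pinned to the plate instead of hand-tracking every marker. A tiny Python sketch of that idea (positions and vectors are made up):

```python
def track_patch(start_pos, motion_vectors):
    """Accumulate per-frame motion vectors -- what the smart vectors
    provide -- so one patch follows the plate across the whole clip."""
    x, y = start_pos
    positions = [(x, y)]
    for dx, dy in motion_vectors:
        x += dx
        y += dy
        positions.append((x, y))
    return positions

# patch starts on a marker at pixel (100, 100); the plate drifts
# right one frame, then down two pixels the next
path = track_patch((100, 100), [(1, 0), (0, 2)])
```

One set of vectors then drives every patch at once, which is what makes the technique so much faster than tracking markers individually.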

After that, also using the vectors, I was able to bring back some of the motion blur that the movement of the clip should create.

After that, I premultiplied each patch and merged it with the original video, adding back the grain we lost in the denoise.

And this is the result.

Categories
Collaborative Unit MA Visual Effects Term Two

GREEN SCREEN EXTRACTION | ALPHA TEST 02

Another green screen practice. This one has a better green screen, but it is also a bit more difficult because of the fur and the white contrast.

First I denoised the clip as always, and then decided to use IBK techniques to see how they work.

This is the alpha I got.

I adjusted transparency and luminosity to integrate it with the clip/image I wanted to add as the background.

And here we can see it works just fine; it looks promising so far.

But when I added the background I found that the alpha had some issues: some parts were not completely white. That is because I used a single treatment for the whole alpha, while the left side of the green screen had some shadows that gave the alpha transparency issues.

Here we can see the issue.

To fix that, I had to combine different alphas from different parts of the body. I made an alpha for the right part of the body, where the green screen is brighter, and another matte for the left part, because it was darker.

I also did a separate alpha for the ears, because they have a harder edge compared with the furry body.

I combined the different alphas into one and also did some luminosity treatment.
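Combining per-region mattes into a single alpha usually comes down to a per-pixel maximum, i.e. a union of the mattes. A minimal sketch with made-up pixel values:

```python
def combine_alphas(*alphas):
    """Union several per-region mattes (e.g. bright right side, darker
    left side, hard-edged ears) into one alpha by taking the per-pixel
    maximum, so a pixel is solid if ANY matte says it is."""
    return [max(px) for px in zip(*alphas)]

# two tiny 3-pixel mattes covering different regions
merged = combine_alphas([0.0, 1.0, 0.4], [0.2, 0.5, 0.9])
```

Each matte only needs to be clean in its own region; the union takes care of the rest.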

And this is the result of the integration with the background.

Categories
Collaborative Unit MA Visual Effects Term Two

GREEN SCREEN EXTRACTION | ALPHA TEST 01

Green screen extraction can be a long process, and there are different techniques that can be applied to get a good result. Here is the original plate I chose.

I decided to use the Keylight node for this practice. The first Keylight creates an interior alpha matte, strong and completely white, so we don't get any unwanted transparency. Then we make a core matte, more like a general matte, which we keymix with a softer matte for the hair and the more delicate parts.
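The keymix step can be sketched as a per-pixel blend between the two mattes, driven by a mask that selects the soft matte around the hair; the values below are illustrative, not taken from the actual script:

```python
def keymix(core, soft, mask):
    """Blend a hard core matte with a softer edge matte using a mask
    (mask=1 uses the soft matte, mask=0 the core), so the interior stays
    solid white while hair edges keep their soft falloff."""
    return [c * (1.0 - m) + s * m for c, s, m in zip(core, soft, mask)]

# pixel 0 is deep inside the body, pixel 1 sits on a hair edge
result = keymix([1.0, 1.0], [0.3, 0.6], [0.0, 1.0])
```

The mask is what lets one extraction be both solid inside and delicate at the edges.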

In Keylight we select the green of the screen that we want to remove and treat it a bit to get a better result.

This particular clip was a bit tricky, since the green screen was not very good. It is important to have a well-lit, well-recorded clip for the green screen extraction to work at its best.

Here we see the resulting alpha, and the different alphas combined into one.

After that, we can integrate it with a different background. I also treated the clip so we don't lose any luminosity and it integrates better.

This is the end result:

Categories
Collaborative Unit MA Visual Effects Term Two

CLEAN PLATES – 3D PROJECTION

In this practice we learn how to create a clean plate from a video. We start with a simple one that we will later use for a collaborative project.

First we track the plate in order to create cards to use for the patches.

Here we can see that the track works, so we can proceed to remove the few markers we have on the wall and floor.

We proceed to create the patches:

The first time I tried, I had some issues with the markers at the far end of the plate. This is the result; I could not remove them.

I kept trying until I found the solution: I had the card in the wrong place, even though it looked like it was in the right position. After I placed the card in a better position, the problem went away, and as we can see in the next video, the markers are removed and we have a clean plate to work from.
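The reason a card can look right from the original camera and still sit at the wrong depth comes down to projection: every depth along the same view ray lands on the same pixel, and the difference only shows once the camera moves. A small pinhole-camera sketch of that (camera at the origin looking down +Z; focal length and points are made up):

```python
def project(point3d, focal=1.0):
    """Pinhole projection of a 3D point to 2D image coordinates."""
    x, y, z = point3d
    return (focal * x / z, focal * y / z)

# A card at depth 2 and a card at depth 4 on the SAME view ray
# project to the SAME pixel from the original camera...
near = (1.0, 1.0, 2.0)
far = (2.0, 2.0, 4.0)
same_pixel = project(near) == project(far)

# ...but once the camera translates, the two depths diverge,
# which is why a card at the wrong depth makes the patch slide.
moved = lambda p: project((p[0] - 0.5, p[1], p[2]))
diverges = moved(near) != moved(far)
```

That parallax is exactly what exposed the badly placed card on the far markers once the shot started moving.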