/r/NukeVFX
We're here to help with your Nuke problems, critique your pieces, and sometimes provide the sickest Nuke vfx breakdowns.
NUKE is the industry standard digital compositing software produced and distributed by The Foundry, and used for film and television post-production.
NUKE is available for Microsoft Windows, Mac OS X, and Linux.
NUKE's users include Digital Domain, DreamWorks Animation, Sony Pictures Imageworks, Sony Pictures Animation, Framestore, Weta Digital and Industrial Light & Magic. NUKE has been used on productions such as Avatar, Mr. Nobody, The Curious Case of Benjamin Button, King Kong, Jumper, I, Robot, Resident Evil: Extinction, Tron: Legacy, Alice in Wonderland, Black Swan and The Hobbit.
Imagine having a time machine and being able to talk to your younger self before starting your career in the VFX industry as a junior compositor. You don't know much about the industry, and even less about the compositor's role. All you know is that you love movies, you're a nerd, and you have a passion for art and creativity.
If you could take your younger self with you on a typical workday, what would you show them about the job of a VFX compositor, both the good and the bad?
How difficult is it? Why?
How creative is it? Why?
How satisfying is it? Why?
... Feel free to suggest any other questions you think might be relevant.
I have two issues here: 1. A white border around the car, which I reduce with an Erode node, but that leaves a black border. 2. When applying a Grade using the alpha image as a mask, a white border appears.
So I have Arnold CG renders from 3ds Max, and I'm using Nuke 14, and I have the following issue: Nuke can't see the crypto layers in the EXR.
They are there, so I can switch from rgba to crypto_material or crypto_object, but the Cryptomatte node itself won't see them.
Not sure if this is a nuke or 3dsmax / arnold problem, investigating both ends.
Sorry for the very noob question. Only been using Nuke a couple months and hitting a big wall here!
I'm trying to merge several files into one EXR image using Copy and Shuffle nodes. I shuffled and copied reflection, specular and grunge successfully, and all was going fine until I got to one of my AOVs, shadow. It appears (to me) to be a three-channel image: I can view its R, G and B channels in the viewer (although each channel is identical). In the Shuffle node, why am I unable to route the R, G, B pipes in the input layer to the AOVs.shadow output layer? How can I successfully shuffle and copy this into my main data stream?
Clearly missing something obvious here. Thank you so much for any help!
I don't do a ton of comp and am trying to understand/optimize things a bit more. From what I understand, Nuke writes out multi-part EXR 2.0 by default? Or is there something special you have to do? Also, is there a way to check whether an EXR you have is multi-part or not?
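On the "how do I check" part: per the OpenEXR file-layout spec, whether a file is multi-part is recorded in bit 12 of the version field in the first 8 bytes of the header. Here's a minimal Python sketch that reads just those bytes, no OpenEXR bindings needed (the function name is mine):

```python
import struct

EXR_MAGIC = 20000630  # 0x01312F76, first 4 bytes of every EXR file

def is_multipart_exr(path):
    """Return True if the EXR's version field has the multi-part bit (bit 12) set."""
    with open(path, "rb") as f:
        magic, version = struct.unpack("<ii", f.read(8))
    if magic != EXR_MAGIC:
        raise ValueError("not an EXR file: %s" % path)
    return bool(version & (1 << 12))
```

Handy for batch-checking a render folder; for a single file you can also just look at how many parts a viewer lists.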
I'm currently having a small but annoying issue in 15.1 when dragging and dropping files into the node graph. No matter where I drop the file, it always appears in the MIDDLE OF THE NODE GRAPH and not at the CURSOR POSITION. So on a large project, I have to navigate miles to find the dropped file.
Any ideas?
Hello, I need help ASAP; this is supposed to be done tomorrow.
This is my node tree that I created by following a youtube tutorial.
And this is my CG element, a robot that has been edited into this background plate using Maya and Nuke.
I'm completely new to Nuke and barely understand anything, so keep that in mind when helping me. You see the reflection on the door? That reflection comes from the shuffled pass specular_Indirect. I want to rotoscope this reflection so it only lands on the floor, where the red roto is, but I don't understand where or how.
To me it makes sense to connect it to the Shuffle node, or to the Grade because the Grade has a mask input, but it does nothing. I'm completely new to this though, so I'm definitely missing something.
If the screenshot is blurry, the specular_Indirect is the second node in the primary passes backdrop. I probably need to rearrange the tree, but I just don't understand how at all. Please help me.
I'm sorry if this is dumb, but I didn't go to VFX school and finding specific answers online is hard. I was wondering: what the hell is Nuke for? I understand you can simulate or animate several pieces of footage in, for example, Maya, C4D or Houdini, and bring them together in Nuke. Is that all it's for? I've seen talk about realistic light and making shots look real in Nuke, but isn't that what renderers are for? I use Redshift for my renders; is Nuke basically a replacement for renderers? Or do you need to render BEFORE going into Nuke? Then what is the point of Nuke if everything is already rendered?
Basically, I don't know where Nuke fits in a workflow and why it is needed. I usually just add everything to a scene in C4D and render the whole animation, and that's it. Can I just model everything and then animate/light/add materials in Nuke?
As a small studio, or even a solo Nuke user, are there any tips or tricks to keep gizmos managed? A workflow to let a couple of machines access the same version of gizmos and plugins so that scripts can be shared, and also, at some point, to introduce a render farm into the environment?
Any advice very much welcome.
-Bonus points for keeping it simple!
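One common, simple answer here: put the tools on a shared network folder, point every workstation (and later every farm blade) at it with the NUKE_PATH environment variable, and keep a small init.py at its root that registers the sub-folders. A sketch of that init.py, with a hypothetical mount path — adjust to your own storage layout:

```python
# init.py — lives at the root of the shared tool folder,
# e.g. /mnt/pipeline/nuke (hypothetical path).
# Every machine sets NUKE_PATH=/mnt/pipeline/nuke so Nuke runs this on launch.
import nuke

# Register sub-folders; paths are relative to this init.py's location.
nuke.pluginAddPath('./gizmos')   # .gizmo files
nuke.pluginAddPath('./python')   # Python tools and menu.py additions
nuke.pluginAddPath('./plugins')  # compiled plugins (per Nuke version if needed)
```

Because render-farm machines read the same NUKE_PATH, renders resolve the same gizmo versions as the artists' sessions; putting the folder under git gives you cheap versioning and rollback.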
Hello there, I'm a student learning compositing and Nuke and I have a small problem: Nuke on my PC doesn't detect anything in the .abc file. I know the problem isn't coming from the file, since the other students with the same file don't have the problem, and I tried using the file on another PC and it just worked. I tried the classic things (restarting Nuke, restarting the PC...) but it doesn't work. When I drag and drop my file into Nuke, the window that appears is just empty. Does anyone know what may have caused that?
(We work on Nuke 14.0v5 at school.) (And excuse my English, I'm French ^^')
Hello!
I've watched the movie Spider-Man: Across the Spider-Verse a few times now, and I've been seriously in awe of the 6 main universes they made for that movie-- especially Gwen Stacy's universe, Earth 65:
The paint effect with the brush strokes and everything is beautiful, and I think I've somewhat managed to recreate it in Nuke (the same software Spider-Verse used for compositing). Here are some examples:
I'm very surprised that I don't see many people talking about remaking this effect. I guess it's kind of specific, but I've looked pretty much everywhere and wasn't able to find too much... So I took it upon myself to do more research, and thankfully, I DID find something: an amazing tool by Perceval Schopp called
"pScatterStrokes", posted on Nukepedia. In summary, it runs the position pass through a BlinkScript that converts your scene into a point cloud. On each of those points, a brush stroke is added. I used his as a base and made some subtle tweaks. Sounds kinda simple, right? Wrong. Getting to the point now, I have 2 things I'm struggling to figure out:
Here is the official menu for Spider-Verse's paint tool. I'd like to recreate this exactly.
Here is the desired result:
Looking at this example, you can see that the strokes stick and wrap around the surface. Although this isn't a video, there's no jitter or erratic rotation either. In the end, I realised I needed to use triplanar mapping, but herein lies the other problem... I'm not a coder 😅 The tool was made in BlinkScript, and I'd like to implement triplanar mapping, but have no idea where to start.
A friend of mine came up with an idea to implement this, although he isn't a blinkscript coder either. This may be of help though:
The tool is a bit slow; increasing the density ever so slightly crashes Nuke, which gets annoying, but that's probably the smallest issue right now. I know the creator mentioned he wanted to work with a BlinkScript coder to elevate the tool to the next level, but I haven't seen any update on that.
The tool doesn't work too well on flat surfaces, only really on curved stuff. Not sure why that is, but I know the creator has acknowledged it.
Anyways, I'd be super grateful if any of you know how to help me with triplanar mapping. I've been going at this for months now, and although it's evolving, this triplanar thing has really been a roadblock. Sorry for the longggg block of text 😭 Hope some of you guys can help me out!
What if there is no WHITE in the image? Color matching is the bane of my existence. In addition to answering these questions for me, if you have any tips on simple ways to color match (the gizmo doesn't always work so well), I'm all ears. Thanks in advance :)
Hi, today I was rewatching Fahrenheit 451. Graphically, the film hasn't aged very well, especially in this scene, the jetpack scene.
In my opinion, the cops are not rotoscoped manually, and I don't think they are in front of a giant screen.
The only plausible effect I could think of was some type of blue screen (as far as I know, back in the day blue screens were more doable than green screens).
I'm not an expert but I'm fascinated by these ancient techniques.
Do you have any clue how this scene was made? Thanks!
I was given a bunch of MXF files with some .cube files and was kind of abandoned on this project. I know I should deliver EXRs to the color team in the same colorspace they were delivered to me in. But I'm not sure of the best way to be working in Nuke. Should I work in ACES? Should I use an OCIOTransform in my pipe and invert it on the way out? Should I make a VIEWER_INPUT node? I could even export out of Resolve with the cubes applied and work directly with the colored footage.
Since I'm completely free to do whatever I want on this project and there are a million different ways, I was curious what the community would say the best way of working was.
I do have a reference, which I'm assuming is just the Alexa clips with the cube transform applied (the editor was given mov files with the LUT already applied). So I'd like to work as close to what that looks like as possible.
Dear people of the comp,
I'm running our beloved software on Pop!_OS 22.04 with GNOME and it works like a charm, even when I switch to Wayland. I've been trying to run it on COSMIC DE as well, and it looks like it's working, except that when I maximize a panel and then press backspace again, I get this result:
It's like Nuke cannot arrange its windows back. Is anybody encountering this error? Do you know how to solve it? I'm perfectly aware that COSMIC DE is still alpha, so I probably just need to wait for this to be patched.
Still, if any of you have found a workaround I'd be all ears.
Thanks in advance
What exactly happened between these two frames? I want to replace my plain sky with a dark, cloudy one like this. Also, a friend of mine told me there's a default sky-replacement setup in Nuke. Can someone help me replace my sky like this?
Can you give me any tips for screen replacement in Nuke? About the light around it, tracking, etc. Thanks.
Hey guys, I was trying to understand when filtering actually happens. Will it filter twice if I use 2 Transforms instead of 1? How many filter hits are okay? I'm confused, so please bear with me a little. I'm trying to understand what the best practice usually is.
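For what it's worth: each Transform resamples (and therefore filters) once, but consecutive transform-class nodes concatenate — Nuke multiplies their matrices together and filters only once for the whole chain, as long as nothing in between (a Blur, a Merge, etc.) breaks concatenation. A toy pure-Python sketch of why combining matrices first gives the same result as applying them one after another (illustrative numbers only):

```python
def matmul(a, b):
    """Multiply two 3x3 matrices (row-major lists of lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(m, p):
    """Apply a 3x3 homogeneous 2D transform to a point (x, y, 1)."""
    return [sum(m[i][k] * p[k] for k in range(3)) for i in range(3)]

# A translate by (10, 5) followed by a uniform scale of 2
translate = [[1, 0, 10], [0, 1, 5], [0, 0, 1]]
scale     = [[2, 0, 0],  [0, 2, 0], [0, 0, 1]]

# Concatenation: one combined matrix, so the image is resampled and filtered once
combined = matmul(scale, translate)

p = [3, 4, 1]
assert apply(combined, p) == apply(scale, apply(translate, p))
```

So two Transforms back-to-back cost one filter hit, not two; the practical rule of thumb is to keep your transforms adjacent in the tree where you can, and not to sweat it much beyond that.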
I'm getting into Nuke and working with a render that I rendered out of After Effects (I have plugins there), and I want to use a different image sequence that has my zDepth pass in it. I basically want to use the zDepth pass of the original sequence on my new render. Is that possible?
Hi
I've found a script on Nukepedia that would be of great help on 2d traditional animation project.
The animation frame steps aren't always even (mostly 2 held frames per image), but there are many exceptions.
Therefore, using a FrameHold (incremented by 2) wasn't always the solution when I wanted to match an animated roto, for instance, to the animation step.
Then I found this: AnimHolder, a script that should facilitate the process!
https://www.nukepedia.com/python/misc/animholder
I've installed Nukepedia scripts before, but this one doesn't appear anywhere. Is there something I'm missing?
Thanks to anyone who wants to help :)
Also, if someone shares the same problem, I'm happy to discuss solutions together!
Hello guys!
For some reason CopyCat does not see the GPU and writes that the GPU is not available.
I'm using NVIDIA GeForce GTX 1070.
What could be the problem? I'd be glad for any help solving it.
Hi everyone.
Is there a way to see the preview while writing?
Usually I have simple projects with some LUTing, so the tree is: Read → OCIOTransform, then split to the Viewer and Write nodes. When I hit F5, I can see the whole process in the nodes, but the preview doesn't update during it; I only see the progress-bar popup and the nodes turning yellow. There's no preview during the write process. Is there a way to update the preview while writing/rendering?
The backplate and the clip are originally different resolutions, but I'm just trying to make the girl walk along the path (don't worry about the lighting and roto, it's just a quick mockup). I just want the shot in the left panel to show the whole backplate and not cut off where the greenscreen clip ends.
For those of you who have used both, especially within Nuke: do you think one is better than the other, and if so, why? And what are some notable comparables?
Hey. Having a real 'mare with NukeX 14.1v5... copy works consistently, but from time to time paste just stops. The only thing that clears it is relaunching the script, which might give me 20 mins or 2 hrs before it fails again.
I've cleared the cache, killed localisation, changed keyboards, and launched via terminal (nothing obvious in the logs)... it happens completely randomly on different nodes, etc. I thought Flame might be confusing the situation, so I tried not opening Flame while using Nuke... still no love.
M1 Ultra Mac Studio, NukeX 14.1v5, macOS 14
Any thoughts? Should I just bite the bullet and go to 15?
Based on a video, I'm trying to recreate a pipeline for emotion transfer using LivePortrait, but it uses a node that is not public. As the author describes it, it just stabilizes and crops the image (it takes the original image and a CornerPin2D).
Can you please tell me how I could recreate this node? I tried exporting a match-move transform and centering the image (in a 512 by 512 frame), but it doesn't seem to be the same as in the example.