/r/colorists
This sub is for anyone involved in the process of coloring video. You can post links to articles, you can ask questions and you can ask for critique. We do skew towards professionals - mark yourself as a Novice unless you do this for a living.
/r/editors - Pro editors
/r/VideoEditing - Home video, newbie-oriented
/r/AfterEffects - obvious.
/r/vfx - Visual effects
/r/sfx - Special effects
/r/videography - All about cameras, rigging, etc.
/r/bmpcc - Sub dedicated to the Blackmagic Design Pocket Cinema Camera
/r/creativecommons - Ask for and post free content
Hi, so I'm the picture editor and am taking care of final delivery. I received the coloured footage as ProRes files, set everything back up in Premiere and got ready for final export. I compared the ProRes files against my final export and the colour matched in QuickTime, in VLC Media Player and on YouTube, though in each player the grade looked a bit different. I wasn't involved in the colour session, so I'm not sure which player is most accurate, but I assume VLC, even though this video is meant for web and social.
So the director is saying the colour isn't accurate. I'm assuming they mean it doesn't look how it's meant to in QuickTime, but that it does if they open it in VLC.
My question is: is there anything I can do on my end to make the footage look "accurate" on export, and is this an issue with Premiere Pro?
My understanding is that the colourist graded with their software/monitors configured for broadcast (I think gamma 2.4), which is why it looks most "accurate" in VLC. But I also know people have issues with Premiere Pro.
Any ideas? BTW, this is a professional colourist who works at a post house on major commercials, so I assume they know what they're doing.
Edit: the colourist delivered the ProRes files in Rec709-A (gamma 2.4).
Thoughts?
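If it helps to picture what the mismatch actually does, here's a back-of-the-envelope sketch (my own toy numbers, not anyone's official pipeline) of how a value graded against a 2.4 display reads when a player decodes it with a flatter gamma, roughly the ~1.96 people usually attribute to QuickTime:

```python
# Toy illustration of a gamma mismatch: a value mastered for a 2.4 display
# looks lighter when the player decodes with a flatter gamma (~1.96 is the
# figure usually quoted for QuickTime's handling of 1-1-1 tagged files).
def encode(light, gamma):
    return light ** (1.0 / gamma)

def decode(signal, gamma):
    return signal ** gamma

graded_light = 0.18                      # 18% grey as seen on the grading monitor
signal = encode(graded_light, 2.4)       # what ends up in the file
seen_in_player = decode(signal, 1.96)    # what the flatter-gamma player shows

print(f"intended {graded_light:.3f}, shown {seen_in_player:.3f}")
# shown comes out around 0.25 -- brighter and washier than intended,
# which is the classic complaint
```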
https://flandersscientific.com/DM242
https://www.shopfsi.com/DM242-p/dm242.htm
Panel Specifications
Screen Size: 24"
Resolution: 1920x1200
Bit Depth (Color): 10bit (1.073 Billion Colors)
Contrast Ratio: 1800:1
Backlight: Wide Gamut W-LED
Pixel Efficiency: 99.999%
Max Luminance: 400 nits
Viewing Angle: 179°
EDIT - added panel info from https://www.liftgammagain.com/forum/index.php?threads/fsi-announces-dm-242-monitor.19090
What's the thing about Zunzheng?
I've made a series of videos explaining how to create LUTs and looks for cameras, and how to use them.
The first installment covers the ARRI Alexa Classic:
https://m.youtube.com/watch?v=XCkyX4eqqwU&list=PL2Wy2DLVqWZmfMJ6-xzbSa-zFH1nKfqHY&index=2&pp=iAQB
I'll be covering the Alexa Mini, LF, Alexa 35, Sony Burano, Venice and RED DSMC2 and 3 over the coming weeks.
I have a Canon R5 Mark II with multiple videos shot in 8K raw, and an M1 Max MacBook with 64GB of memory.
I only edit in FCP because I already had it, and I've been editing in a Rec2020 PQ project.
For the most part I haven't used any LUTs, but I'd like to. I tried making some with Photoshop, but most guides only cover how to make Rec709 LUTs.
Are there any guides on how to make an HDR LUT, or any recommended LUTs for Rec2020 PQ or HLG?
My videos are mostly watched on Apple devices with HDR screens, for what it's worth.
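In case it helps while you hunt for guides: a .cube LUT is just a text file, so you can generate a simple one yourself to test the plumbing. Here's a minimal sketch (my own toy example with made-up settings, not a recommended look) that writes a 3D LUT intended to sit on Rec2020 PQ footage and does nothing but a slight saturation trim:

```python
# Minimal .cube 3D LUT writer. Toy example: a mild saturation trim applied
# directly to PQ-encoded RGB (fine as a plumbing test; a real HDR look
# would usually work in linear or a grading space instead).
SIZE = 33          # 33x33x33 is a common grid size
SAT = 0.9          # <1.0 desaturates slightly

def desaturate(r, g, b, amount):
    # crude luma-based desaturation using BT.2020 luma coefficients
    y = 0.2627 * r + 0.6780 * g + 0.0593 * b
    return tuple(y + (c - y) * amount for c in (r, g, b))

with open("pq_sat_trim.cube", "w") as f:
    f.write("TITLE \"PQ saturation trim (toy)\"\n")
    f.write(f"LUT_3D_SIZE {SIZE}\n")
    # .cube ordering: red varies fastest, then green, then blue
    for b_i in range(SIZE):
        for g_i in range(SIZE):
            for r_i in range(SIZE):
                r, g, b = (i / (SIZE - 1) for i in (r_i, g_i, b_i))
                out = desaturate(r, g, b, SAT)
                f.write("{:.6f} {:.6f} {:.6f}\n".format(*out))
```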
I've seen different workflows: some put NR right at the beginning or close to it (for example after exposure), some put it last but still in DWG, before the final CST out, and I've seen some advocates of putting it in Rec.709, after the CST-out node, so the noise reduction is more WYSIWYG.
Same question for Dehancer and film emulations: they're meant to work in both DWG and Rec.709, but where do you prefer to put them in your node tree, and why?
My best take has a gust of wind hitting the reflector on my guy's face. Surely there's a tool that could flatten tones across a human face by now?
So, the whole Apple gamma shift issue has been tried and tested and beaten to death by colorists for years, to the point that I can work around it pretty easily (although it's sometimes time-consuming). But I've been wondering: when does the colour shift actually happen?
I switched to Mac before I started colouring, before I even started using DaVinci, and before Premiere even tried to include colour management in its settings. So I don't know exactly how it works on Windows.
I know that if I colour for Rec709 and tag the output as Rec709, it will play in any software as intended. But I also know that YouTube will yeet all my tags when converting to its format and leave me with a different gamma, hence the Rec709-A fix. To me that just means the "base", untagged video is operating on the crazy Apple gamma standard.
If, however, you're on a Windows system and you colour for Rec709 and leave the video untagged, what happens? Does the gamma stay 2.4? Does the gamma shift happen on export?
Once again, the reason I'm asking is this: would having a Windows system make everything simpler for me? Would it put an end to these annoying workarounds and exporting different versions for YouTube? Or is exporting for YT plus most viewing software also a hassle for you Windows users?
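One habit that helped me reason about this (just my own approach, not gospel): actually inspecting the tags in the exported file, since "untagged" vs "tagged" is what the players key off. Assuming you have ffprobe installed, something like this shows what a player will see:

```python
# Quick check of the colour metadata a player will see in an export.
# Assumes ffprobe is on PATH; "unknown" fields mean the file is untagged
# and the player falls back to its own guess.
import json
import subprocess

def color_tags(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=color_primaries,color_transfer,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)["streams"][0]

print(color_tags("final_export.mov"))
# e.g. {'color_space': 'bt709', 'color_transfer': 'bt709', 'color_primaries': 'bt709'}
```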
Hi,
A bit about me: low budget, new to colour grading, mostly motion graphics in sRGB or Rec709 in Premiere Pro. I just got an Asus QD-OLED monitor for all-round use, I'm about to get an EIZO 4K CG-series (the newer the better, or a CG2700X) or an LG C4/G4, and last but not least I'd like to learn about being a colorist.
I'm posting this since there seem to be few posts on this topic that clear up the confusion.
Now!
I've learned more than I expected on this subreddit about the basics of calibrating a monitor for colour.
I need an I/O box, a solid monitor, ColourSpace (software) and a probe.
One thing I'm really confused about is calibrating QD-OLED.
Q.
Do I need a $7000 Jeti high-res spectrometer for QD-OLED monitors, or would Calibrite's Display Plus HL suffice?
Companies like Calibrite say that calibrating a QD-OLED with a $300-$500ish colorimeter is fine, while others argue that a $7000 Jeti 2nm high-res spectrometer is the minimum requirement for accurate QD-OLED calibration, or things will be off.
I see that FSI has made a QD-OLED reference monitor, so there seems to be technical acceptance of QD-OLED for colour grading, and there must be a good method for calibrating a QD-OLED monitor.
**If anyone experienced could share their approach or clear up the myths around calibrating QD-OLED monitors for colour, that would help newbies a lot.**
PS
I get that ColourSpace is solid software, and I'll be getting the PRO version in the future.
But people seem to have very different takes on QD-OLED and calibrating it, and it seems absurd to a beginner that a $7000 probe would be necessary. I mean, what are consumers supposed to do when they want to calibrate their QD-OLED TVs? Call in a specialist every six months?
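For what it's worth, the way I understand the spectro-vs-colorimeter argument (happy to be corrected): a colorimeter's filters don't perfectly match the standard observer, so on a newer emission tech like QD-OLED its raw readings drift, and the usual fix is a per-display-type correction, either a vendor-supplied correction file or a matrix derived from a reference spectroradiometer measuring the same patches. Conceptually it boils down to something like this (made-up numbers, purely illustrative):

```python
# Toy illustration of colorimeter "probe matching": a 3x3 correction matrix
# (normally derived by comparing the colorimeter against a reference
# spectroradiometer on the same display type) applied to the colorimeter's
# raw XYZ readings. All numbers below are invented for illustration.
import numpy as np

# Raw XYZ measured by the colorimeter on a QD-OLED white patch
raw_xyz = np.array([95.2, 101.3, 106.8])

# Hypothetical correction matrix for this probe + this panel type
correction = np.array([
    [1.012, -0.008, 0.003],
    [0.004,  0.996, 0.001],
    [-0.002, 0.005, 1.021],
])

corrected_xyz = correction @ raw_xyz
print(corrected_xyz)   # what calibration software would treat as the "true" reading
```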
Hi, is there any converter that supports 2K DCI (2048x1080) as HDMI output? I've only found the AJA Hi5-Plus, which says it supports 2K 1080p, but it's a bit unclear whether that also carries through to the HDMI output. Thank you.
Hello everybody. I have an issue at hand: I just bought a Calibrite Display Plus HL to calibrate my monitor, but I never asked myself whether, and what, I need to tweak in my monitor's settings to get a good calibration.
I'm a newbie to colour calibration and I just bought this supposedly really good probe, but I'm unsure how to proceed. I don't think the calibration software (Calibrite Profiler) and the monitor's settings communicate with each other, so I don't know whether I should raise the brightness, tweak the contrast, or set a "custom preset colour space" before starting.
My monitor has several colour-space presets to choose from, including sRGB, EBU, SMPTE-C, REC709 and DICOM SIM, and they're all very different from each other... Should I select one in particular before starting the calibration process?
I'm at a loss, and I wouldn't want to hobble my monitor's capabilities because I set it up wrongly before calibration.
My monitor is a ViewSonic VP2768 2K display and I work on Windows 11.
Any help would be much appreciated; even if it's just a theory lesson about colour calibration, I'd gladly welcome it.
Hello, are there any French-speaking colorists on this sub? Let me know, I have a few things I'd like to share.
For anyone who has purchased this course, I have a few questions, if you don't mind answering them:
- Are there things in there that you really can't find online on the various pro YouTube channels, specifically the channels of the teachers in question?
For example, if I've watched all of Cullen Kelly's videos, is there something in his segment that will give me more value than what he's already given away for free on his channel? (Same for Darren Mostyn and the others.)
- Is it geared more towards advanced and expert users, with some insider secrets about useful pro techniques? Or does it only scratch the surface of the knowledge you can find online? I imagine that just an hour or so per teacher isn't much to go really in depth on some hard topics.
I'm certain everyone teaching in this course is a genuine expert in their field, that I don't doubt; I'm just wondering whether the knowledge inside this specific course is worth the money.
Hi everyone,
I'm looking to switch to the new node stack layers in Resolve 19, as I feel they're a nice improvement to my workflow. I'm interested in your experiences with the feature and how you're using it. Have you found any limitations or things to watch out for?
My idea is to use groups to colour-manage different cameras, and then have three node stack layers for primaries, secondaries and look dev. I've never liked creating looks on a post-group, as you can't really use the versions feature, or at least I haven't found a way other than grabbing stills.
I look forward to hearing how you use the layers.
Never mind. I got it, but I don't know why it didn't work before.
I just purchased this and downloaded and installed the Calibrite software, but it comes up in demo mode and I can't find how to turn that off. I would RTFM, but there apparently is no FM. Anyone know what I'm missing?
Thanks.
If you have footage such as ARRIRAW Open Gate 3424x2202, edited with a cropped "wide 2.39 look" of 3424x1432, would you make your Resolve grading timeline the same? Or would you make your timeline the source resolution and crop with the output blanking option?
Something making me question this is that (1) I've never seen a deliverable at this aspect ratio or resolution, and (2) the DeckLink 4K Mini Monitor doesn't display 3424x1432 (it has to go in a timeline at a resolution it supports and be scaled accordingly).
So, what would you set your grading timeline resolution to?
Thank you in advance!
After seeing good reviews of this ebook, I purchased the deal I received by email after subscribing to Cullen Kelly, but I'm not able to actually download the ebook after payment. There's no obvious error on the website; it simply doesn't respond. Wondering if any other colorists are in the same boat, or if there's a possible solution? I sent an email to their support address with the invoice attached, but no response so far.
We've pointed you at this thread rather than have you ask about your specific monitor in the main subreddit.
No, you can't just connect a generic monitor.
We're going to talk to you as a professional. This means, no, the "workarounds" are a total compromise. In those cases, you're on your own.
This is about creating a trusted reference - not just what you think looks good. And yes, the client's screen(s) could be all out of whack. And yes, we're talking web too.
Brands that are reliable and (professionally) inexpensive:
If you're going to compromise, here's our best advice:
No matter what the manufacturer says was done at the factory, you will need to calibrate your displays regularly.
I want to know if this particular brand of wide gamut/P3/sRGB monitor is up to snuff.
It's not. Without the hardware/probe and the ability to load a LUT, forget it.
Can I just calibrate a monitor, it's just going to the web.
Same problem. Without a probe, you don't know what you have.
Ok, I have a probe.
You still need a breakout box - something to get the OS out of the way.
The idea here is a confidence monitor. Something you know you can have confidence in.
OK, I have a probe and a BMD Mini-Monitor. Am I good?
Not unless you can generate and load a LUT into the monitor.
Really? What do I need to buy now?
A LUT box will solve this. The monitor still may be junk, but you have a clean signal chain.
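To make "generate and load a LUT" concrete, here's a toy sketch of what the simplest possible correction amounts to; real calibration software (ColourSpace, Calman and friends) derives a full 3D version of this from probe measurements rather than from an assumed gamma:

```python
# Toy illustration only: the simplest form of a display correction LUT.
# Real calibration software measures the panel with a probe; here we just
# assume the panel behaves like gamma 2.2 and bend it towards a 2.4 target,
# then write the result as a 1D .cube file a LUT box could load.
STEPS = 1024
NATIVE_GAMMA = 2.2   # assumed measured behaviour of the panel
TARGET_GAMMA = 2.4   # the response we actually want to see

with open("gamma_trim_1d.cube", "w") as f:
    f.write(f"LUT_1D_SIZE {STEPS}\n")
    for i in range(STEPS):
        x = i / (STEPS - 1)
        # light we want out, re-encoded for what the panel actually does
        y = (x ** TARGET_GAMMA) ** (1.0 / NATIVE_GAMMA)
        f.write(f"{y:.6f} {y:.6f} {y:.6f}\n")
```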
Great, I'll just buy a C8/9/X from LG, people talk about that all the time.
That's a good client monitor. And great that you have a breakout box and probe. This is useable if you're starting off into HDR - but just know, it's not to be trusted.
What about my iPad Pro? Apple tells me it has Wide Gamut
An iPad Pro is an excellent way to check Apple devices. It's well designed out of the factory.
Plugging your system through it (via Sidecar, Duet display) puts us back in the "OS interference" level. But it's good for a check of the materials - just not so good for live grading.
Last, check out these three prior posts:
-----
Let's see how this thread goes and we'll refine as we go.
Let's say you work in node-based YRGB.
You can control your tone mapping (max luminance, e.g. 10,000 nits) as well as your OOTF on each in and out CST node.
I think the standard here is to keep the OOTF off when going from log to DWG, then turn the forward OOTF on when going from DWG to Rec.709? What about the inverse OOTF?
However, when you work in YRGB Color Managed set to Custom, with the same colour spaces specified in the project settings for input, timeline and output (log to DWG to Rec.709), you can't choose the OOTF the way you want, nor your tone mapping?
Does the software do a good job determining these settings on its own, or would you get more benefit from selecting them yourself and working in YRGB?
These things are still unclear to me; thank you for explaining.
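For what it's worth, the way I currently picture the forward OOTF (someone please correct me if this is off): it's the scene-to-display rendering you get from pushing scene light through the BT.709 camera OETF and then through a 2.4 display EOTF, which lands you on a mild system gamma rather than a straight-through transfer. A rough sketch of that idea:

```python
# Rough sketch of the BT.709 -> BT.1886 "system gamma" idea behind the
# forward OOTF: scene light encoded by the 709 OETF, then decoded by a
# 2.4 display EOTF, does not come back to the same value.
def oetf_bt709(scene):
    # ITU-R BT.709 camera OETF
    if scene < 0.018:
        return 4.5 * scene
    return 1.099 * scene ** 0.45 - 0.099

def eotf_bt1886(signal, gamma=2.4):
    # simplified BT.1886 display EOTF (zero black level)
    return signal ** gamma

for scene in (0.02, 0.10, 0.18, 0.50, 0.90):
    display = eotf_bt1886(oetf_bt709(scene))
    print(f"scene {scene:.2f} -> display {display:.3f}")
# Display light comes out darker / more contrasty than the scene light;
# that rendering intent is what the forward OOTF bakes in (or leaves out,
# if you turn it off).
```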
Exposure -> White Balance -> Saturation in serial nodes. That is a very popular workflow recommendation for DaVinci Resolve.
But is there any reason why I shouldn't set it up as parallel nodes instead, since each node would then get a clean feed directly from the source? Why or why not?
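The way I think about it, hedging because I'm describing the behaviour from memory rather than from Blackmagic's docs: serial nodes chain, so white balance sees the exposure-corrected image, while parallel nodes each see the source and their outputs get blended by the Parallel Mixer instead of being applied one after another. A toy numeric sketch of the difference:

```python
# Toy sketch (not Resolve's actual math): two corrections chained in serial
# versus combined the way a parallel structure mixes independent outputs.
# The parallel combine is modelled here as a simple average, which is an
# assumption about the mixer, not a documented formula.
def exposure(x):       # +1 stop, crude linear model
    return min(x * 2.0, 1.0)

def white_balance(x):  # pretend WB gain on this channel
    return min(x * 1.1, 1.0)

src = 0.25

serial = white_balance(exposure(src))                 # WB sees the exposed image
parallel = (exposure(src) + white_balance(src)) / 2   # each sees the source, then blended

print(f"serial:   {serial:.3f}")    # 0.550
print(f"parallel: {parallel:.3f}")  # 0.388 -- diluted, not compounded
```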
Hi everyone,
I'm trying to find a fabric that's close to the GTI N5 paint used in grading suites. I know neutral grey is technically good enough, but I'd like to find something spectrally flat if possible.
If anyone has a spectrometer or spectrophotometer and access to a grading suite painted with GTI N5, could you take a reading of the paint?
Hi guys! I'm a gaffer by trade and have a fair bit of downtime at the minute during the winter film-industry lull. I'm trying to use my time effectively and have been doing some ShotDeck digging for a project I have next year. However, one feature ShotDeck doesn't have (as far as I can tell) is false colour; I just want to be able to pull in a frame and bring up false colour so I can work out the contrast ratios etc.
I downloaded Resolve, but you need the Studio version for false colour, and paying £200 seems a bit steep for my needs.
I was looking at the Time in Pixels false colour plugin. Does anyone have any experience with it? I like that it can do Flanders colours (as I have a Flanders, although I don't keep it at home). Can you run plugins on the free version of Resolve?
Any other suggestions?
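If nothing off the shelf fits, a quick-and-dirty false colour pass is also easy to script yourself. Here's a rough sketch (my own arbitrary luminance bands, not Flanders' or anyone else's official scale) that paints a still frame by exposure zone:

```python
# Quick-and-dirty false colour for a still frame: map luminance bands to
# flat colours so you can eyeball ratios. Band edges and colours below are
# arbitrary choices, not any vendor's official false colour scale.
# Requires: pip install pillow numpy
import numpy as np
from PIL import Image

BANDS = [                    # (upper threshold on 0-1 luma, RGB colour)
    (0.05, (128, 0, 255)),   # crushed shadows -> purple
    (0.20, (0, 0, 255)),     # deep shadows    -> blue
    (0.45, (0, 255, 0)),     # mids            -> green
    (0.55, (128, 128, 128)), # key/skin-ish    -> grey
    (0.80, (255, 255, 0)),   # highlights      -> yellow
    (1.01, (255, 0, 0)),     # near clip       -> red
]

img = np.asarray(Image.open("frame.png").convert("RGB")) / 255.0
luma = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]

out = np.zeros_like(img)
lower = 0.0
for upper, colour in BANDS:
    mask = (luma >= lower) & (luma < upper)
    out[mask] = np.array(colour) / 255.0
    lower = upper

Image.fromarray((out * 255).astype(np.uint8)).save("frame_false_colour.png")
```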
Working on a TVC, and production has asked for a film-out.
I was wondering what the advantages and best practices are when it comes to doing a film-out.
Would you grade and then send the finished spot for the film-out? Or balance the shots, do the film-out, then apply a look to the film-out scans?
Also, anything to watch out for?
Hey all, beginner-ish here.
I'm a trainee/camera assistant trying to slowly make a move into shooting my own stuff, which I'm also going to be grading.
I'm going down a bit of a rabbit hole: I'm looking at investing in a display to use as a reference monitor, and currently looking at the LG C2 (although it's kinda big for my desk, lol) because of how well spoken of it is on the internet.
Now, I know it has to be calibrated and that it isn't calibrated out of the box.
I use a Mac and just discovered that I should be on Windows to use either Calman LG or ColourSpace.
Luckily I can get access to a Windows PC, but before spending money I'd like to be sure about what to purchase.
I tried using DisplayCAL in the past and it looks very technically complicated to me; I understand what it's doing, but I find it difficult to understand everything it's saying/asking me.
Unfortunately I got very little help online when trying to stick with it.
With that in mind, would Calman LG get me an accurate-looking image without driving me mad trying to understand the software, or would ColourSpace be better?
I have a bunch of shorts that are going to be picture-locked soon and I'd like to get started ASAP.
Thoughts? Opinions? Tips? Any help is massively appreciated.
Using a MacBook Pro M1.
I own both an UltraStudio Monitor 3G and a BMD bidirectional converter that can also hold LUTs.
I have access to a Windows PC.
My current monitor is an LG 27UP850 (still kinda having trouble calibrating it, I can't lie; I currently calibrate with LG Calibration Studio, and yes, I'm aware that software isn't well liked).
X-Rite i1 probe.
Hi,
Just curious whether anybody has ever had a use case for this tone mapping method? To my eyes it looks bad, and I probably wouldn't want it included in a rendering transform, but maybe that's just me?
Cheers
Here is my situation:
I will edit and grade footage coming from one camera only.
For colour management, am I correct in assuming that setting the colour science to "DaVinci YRGB Color Managed", unchecking "Automatic color management", using the "Custom" colour processing mode, and in there selecting my specific camera's input colour space, DaVinci WG/Intermediate as the timeline colour space, and Rec.709 Gamma 2.4 as the output colour space, is exactly the same as working with "DaVinci YRGB" colour science and the same settings, except that with YRGB the input and output CST nodes are shown rather than hidden?
My understanding is that the result is exactly the same, but if you only work with one camera (so, one input colour space), you'd be better off just using the first method, YRGB Color Managed with the settings I mentioned?
Or, in my situation, is there any advantage in using plain YRGB and dabbling with input and output CST nodes on every clip?
I have another question on this topic: to insert a LUT with the first method, am I correct in assuming that the whole node tree is considered to be working in DaVinci WG, so anything I use in that tree (like a LUT) should input from and output to DaVinci WG/Intermediate, with a CST in between to convert to the specific camera space the LUT expects?
Thank you for clarifying this for me... I've been watching a lot of Cullen Kelly to try to understand it, but I haven't found the answer yet.
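In case it helps anyone answering, here's roughly how I picture the "CST sandwich" around a camera-specific LUT inside a DWG tree. This is just a sketch of the node order as I understand it, with placeholder functions standing in for Resolve nodes; nothing here is a real Resolve API:

```python
# Sketch of the node order I have in mind when dropping a camera-log LUT
# into a tree that's otherwise working in DaVinci Wide Gamut/Intermediate.
# Each function is a placeholder for a Resolve node, not an actual API call.
def cst(image, frm, to):
    # placeholder for a Color Space Transform node
    print(f"CST: {frm} -> {to}")
    return image

def apply_lut(image, name):
    # placeholder for a LUT node expecting a specific input space
    print(f"LUT: {name}")
    return image

frame = "source pixels"                                       # already in DWG at this point
frame = cst(frame, "DaVinci WG/Intermediate", "ARRI LogC3")   # into the space the LUT expects
frame = apply_lut(frame, "some LogC3 look LUT")               # hypothetical LUT name
frame = cst(frame, "ARRI LogC3", "DaVinci WG/Intermediate")   # back to the working space
# ...rest of the grade continues in DWG; the output transform handles Rec.709
```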
Hello, I received 4K H.264 footage (8-bit, 4:2:0, S-Log2) from a Sony camera (A7S II), and I was wondering what the best codec to transcode to is. I tried DNxHR HQ, but it takes up too much space on my PC right now. What's the best codec for editing 4K? I'm using Shutter Encoder. Is ProRes LT good on Windows? Thanks.
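In case it's useful: Shutter Encoder is essentially a front end for ffmpeg, so the equivalent transcode is easy to script. A rough sketch for batch-converting a folder to ProRes LT, assuming ffmpeg is on your PATH and your clips are .mp4 (the folder names are just placeholders):

```python
# Batch transcode to ProRes LT with ffmpeg. Assumes ffmpeg is on PATH.
# prores_ks profile 1 = ProRes 422 LT; audio is passed as uncompressed PCM.
import subprocess
from pathlib import Path

src_dir = Path("camera_originals")     # placeholder folder names
dst_dir = Path("prores_lt")
dst_dir.mkdir(exist_ok=True)

for clip in src_dir.glob("*.mp4"):
    out = dst_dir / (clip.stem + ".mov")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "1",   # 1 = ProRes 422 LT
        "-c:a", "pcm_s16le",
        str(out),
    ], check=True)
```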
I've heard a lot about how it's best to bring your Premiere project over to Resolve to grade it. However, my current job involves a lot of re-editing, so is it possible to keep swapping back and forth between the two, or is it better to just focus on editing and colouring in Premiere for this job?
Hello All,
I was looking at the LG C2, but it's way too big for my desk setup, considering I also have another 27-inch monitor. Any smaller OLED options?
What I use:
BMD I/O box (cheapest one)
MacBook Pro M1
X-Rite i1
Hi,
I updated this week from Resolve 18.6.6 to 19.1, and since then I've been having issues with the jog wheel on my Wacom Intuos Pro M.
On 18.6.6 and previous Resolve versions I was able to use the jog wheel to manipulate float sliders inside the OFX panel smoothly; since 19.1 this behaviour has changed, and I'm not sure whether it's intended or whether there's a fix.
Now, when I use the jog wheel on a float slider, I can only move between whole-number values. For example, if a slider goes from -2.0 to +2.0, I can only step through -2.0, -1.0, 0.0, +1.0, +2.0, meaning I no longer have access to intermediate values and have to use the pen or mouse for it to work normally. This doesn't seem to affect integer sliders, only float sliders.
I tried uninstalling and reinstalling the tablet's drivers, but no luck there, and they're up to date.
The jog wheel works flawlessly everywhere else in Resolve and in other software, so I'm guessing something changed with the 19.1 release.
Has anyone noticed similar behaviour and maybe knows how to fix it, or do I have to wait for the next release and hope they've fixed it by then?
Thanks in advance
Mac Studio Ultra M1
MacOs Sequoia 15.1.1
Resolve Studio 19.1 build 12
Wacom Pilot Drivers 6.4.6-3