/r/virtualproduction
A community for the growing world of virtual production: a technique that combines physical and virtual elements in real time (often using game engines) to produce media such as films, TV shows, live events, and AR/VR content.
I need a big green screen with multiple walls, but the StudioLink seems like an expensive choice.
Looking to break into the virtual production industry. I have been studying Unreal Engine and 3D modelling for 8 months now. I'm curious what projects I should have in my portfolio to land an entry-level role in VP.
I'm getting back into using Vive trackers and UE5 for some virtual production, and one of the trickiest things to get dialed in is making sure the Vive tracker is synced with the virtual camera properly. To be more specific, my Vive tracker is mounted a few inches above my camera's sensor, so the rotation of my virtual camera won't match because it will be offset on the Y axis and possibly the Z axis.
I remember there being some calibration software that helped with this but can't remember the name.
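For anyone hitting the same problem: this kind of mounting discrepancy is usually handled by applying one fixed rigid offset, measured in the tracker's local frame, between the tracker pose and the virtual camera. A minimal sketch of the math (the offset and pose numbers below are placeholders, not measured values):

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def camera_pose_from_tracker(tracker_pos, tracker_quat, sensor_offset):
    """Apply a fixed mounting offset (expressed in the tracker's local
    frame) to the tracker's world pose to get the sensor's world position."""
    R = quat_to_matrix(tracker_quat)
    return tracker_pos + R @ sensor_offset

# Placeholder values: tracker mounted 8 cm above the sensor.
tracker_pos = np.array([1.0, 1.5, 2.0])      # metres, world space
tracker_quat = (1.0, 0.0, 0.0, 0.0)          # identity rotation
sensor_offset = np.array([0.0, -0.08, 0.0])  # sensor sits 8 cm below tracker

print(camera_pose_from_tracker(tracker_pos, tracker_quat, sensor_offset))
```

Because the offset is rotated by the tracker's orientation before being added, the virtual camera pivots correctly around the sensor position rather than around the tracker mount, which is exactly the rotation mismatch described above.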
Hey folks! We're building a small-scale VP studio using short-throw projectors. My question is: do you know if it's possible to use three projectors instead of a curved wall to achieve a 180-degree view angle through the real camera?
I'm aware of the problem of the changing distance from camera to screen. Would it be possible to solve that issue through an nDisplay mapping with corresponding distortion in the corners?
The screens will always be out of focus for our purpose.
If anyone has insight into that idea, it would be much appreciated! Thanks guys :D
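The distance variation between flat walls and a curve is easy to quantify. A quick geometry sketch, assuming a camera at the centre of a three-screen 180° arrangement so each flat screen covers roughly ±30°, with a placeholder 2 m perpendicular distance:

```python
import math

# Camera at the origin; one flat screen faces it head-on, with its
# nearest (perpendicular) point d metres away. A curved screen of
# radius d would keep the distance constant at every angle.
d = 2.0  # metres, placeholder

for angle_deg in (0, 15, 30):  # each screen in a 3-wall 180° rig spans ~±30°
    angle = math.radians(angle_deg)
    flat_distance = d / math.cos(angle)  # distance to the flat screen
    print(f"{angle_deg:2d} deg: flat {flat_distance:.3f} m vs curved {d:.3f} m")
```

The corner of each flat wall ends up about 15% farther from the camera than its centre. That geometric difference is something nDisplay can account for as long as the screen meshes in the config match the physical walls; residual brightness falloff toward the corners is a separate, non-geometric issue.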
Need some help with the keyer and cyclorama in the new Reality 5.3 SP2 release. Hub version 1.5.
Navigating the exciting world of outsourcing: whether you're a newcomer to the game or consider yourself a seasoned pro, 80 Level wants to hear from you with an all-new, brief, 4-minute survey designed to gather invaluable insights.
Why join?
Participate in this exciting journey, start the survey: https://80level.typeform.com/research
The results are going to be shared in 3-4 weeks. Thank you for adding your voice!
Here's a high-level overview of the desired workflow:
https://youtu.be/DQT0Qy856mA?si=G6hksL8v2GEGPpfJ&t=379
Here's a detailed technical step by step example of the workflow using Unreal:
https://www.youtube.com/watch?v=J2pnk97zIDg
I'd be grateful to learn from, and to share, a tutorial on set alignment (aligning the real world with the virtual world). There is no Unity-focused tutorial addressing this workflow. Do you have the knowledge and insight to create a tutorial and share it with the internet?
Currently, I'm using an iPhone with Unity's Virtual Camera app to sync with and control a Cinemachine virtual camera. I don't have a workflow for aligning our real world with the virtual Unity environment.
With regard to translating the above workflow from Unreal to Unity: if it doesn't translate, can you recommend an apt Unity Engine workflow replacement for accomplishing the same outcome?
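Whichever engine you use, set alignment generally reduces to computing one rigid transform that maps tracker space into the virtual scene. A minimal engine-agnostic sketch of that math, assuming you can measure a single reference point in both spaces plus the heading difference between them (all names and numbers here are illustrative):

```python
import numpy as np

def alignment_transform(ref_pos_tracker, ref_pos_virtual, yaw_offset_deg):
    """Build a 4x4 matrix mapping tracker-space points into the virtual
    scene, from one reference point measured in both spaces plus a
    measured yaw (heading) difference. Assumes both spaces share the
    same up axis (Y) and scale (metres)."""
    a = np.radians(yaw_offset_deg)
    R = np.array([[ np.cos(a), 0, np.sin(a)],
                  [ 0,         1, 0        ],
                  [-np.sin(a), 0, np.cos(a)]])
    t = ref_pos_virtual - R @ ref_pos_tracker
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

# Illustrative: a floor marker at (0.5, 0, 1.0) in tracker space should
# land at (10, 0, 5) in the virtual scene, with no heading difference.
M = alignment_transform(np.array([0.5, 0.0, 1.0]),
                        np.array([10.0, 0.0, 5.0]), 0.0)
print(M @ np.array([0.5, 0.0, 1.0, 1.0]))  # maps the marker to (10, 0, 5)
```

In Unity terms, the same effect is typically achieved by parenting the tracked camera under an "alignment" GameObject and setting that parent's position and yaw, rather than multiplying matrices by hand.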
I wanted to set up a virtual studio and originally was just going to go with a green screen and markers, but after my WMR headset died I started looking into Lighthouse and found out it was used for tracking in video production, which I never knew. However, it seems all the videos using it are three years old, and the people who used to make content about video production with Vive trackers have moved on to different solutions (all too expensive for me, above $1k) or to Aximmetry. However, I swear everyone switching to Aximmetry seems to be sponsored by Aximmetry... so I don't know. What would you guys recommend? Buy a Lighthouse setup or go a different way?
Just starting with virtual production. I have access to a Canon R5 with the 180 lens. I'm wondering if anyone has experimented with using this in vMix for 3D backgrounds, or for creating Gaussian splats? I know this isn't the traditional process, but I'm just wondering what the effect would be...
Can you achieve proper camera alignment and lens calibration with the Mars tracking system without using nDisplay or the other ICVFX plugins? For instance, if I have a chair and an accurately scaled model of it in my Unreal scene, will it ever stick when translating the camera?
Has anybody had an issue with the rover where it just goes back and forth between gray and white on the main display and never turns blue for tracking?
Hi, I don't really know if I'm explaining this well. I'm pretty new to VP so forgive me if my terminology is wrong.
In this video, you can see the virtual background moving away as the actor walks
https://www.youtube.com/shorts/Lg4lRcnyhL4
I understand (mostly) nDisplay and integration with Mosys, and I have a setup where I can shoot moving shots with the frustum + parallax, but in the linked scene it seems like the camera is pretty still, yet the background itself is moving backwards to add to the illusion that she's walking through a forest.
How would one go about accomplishing this? Did they attach the nDisplay to a camera rig rail or something?
Thanks in advance!!!
I've been searching the internet trying to understand this small VMC-BNCM1 cable. It seems that it will allow the Sony FX3 to be genlocked, but I am not certain.
Can somebody help me with this? If I want to use the FX3 for VP, is this the magic cable for genlocking my camera?
Hello, I manage equipment for a small TV studio. We recently got an LED wall with a NovaStar VX16s controller.
No matter what Hz, fps, and shutter combos I use, I always get horizontal scan lines across the screen on my camera whenever I do any kind of tilting motion. With no tilt, the image is fine and the screen looks great.
I've tried a RED Helium, a Canon C500 Mk II, and an URSA G2; same results across the board.
Any ideas on what's causing this and how to fix it? Again, it's only with tilting. Pans are fine.
Student here, so not starting with the deepest knowledge. I'm trying to set up this: https://docs.unrealengine.com/4.27/en-US/BuildingWorlds/VRMode/
But I've got the Oculus Quest 2. Where it says the Rift is the supported headset, is that outdated by now, or is the Quest actually incompatible?
Cheers
Hi, I am new to virtual production. I am about to make some music videos where I film myself, rotobrush myself out, then use camera-tracking software to get a decent camera solve. Maybe I'll do that in Blender. In most of these programs, you can specify real camera settings to help get a more accurate solve.
But I can't figure out what MFT, cropping, etc. imply for these settings.
For instance, I have a Lumix GH5. It has a sensor size of 17.3 mm x 13 mm.
I shot test footage at a 21 mm focal length and chose 1080 25p output.
Now, the camera sensor has a 4:3 ratio (about 1.33), but the file is of course 16:9 (1.778).
Do I just tell the solver to use the 4:3 ratio? That is, assume the camera is seeing what the full sensor sees, and that the 1080 result is simply cropped after the fact?
My goal is to get the most accurate solve, so really trying to figure this all out.
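One way to reason about it: if the camera reads the full sensor width and crops vertically to reach 16:9 (an assumption worth verifying for your exact recording mode, since some modes crop differently), the solver should be given the effective sensor area actually imaged, not the full 4:3 area. A quick sketch of that arithmetic:

```python
# Full GH5 sensor (4:3): 17.3 mm x 13 mm.
sensor_w, sensor_h = 17.3, 13.0

# Assumption: 16:9 video uses the full sensor width and crops the height.
video_aspect = 16 / 9
effective_w = sensor_w
effective_h = sensor_w / video_aspect  # 17.3 / (16/9) ≈ 9.73 mm

print(f"effective sensor for the solver: {effective_w:.2f} x {effective_h:.2f} mm")
```

Under that assumption you would enter a 17.3 mm x 9.73 mm sensor at the same 21 mm focal length, so the solver's field of view matches what the 16:9 file actually shows.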
If there is a more fitting group for these sorts of questions, please let me know!
thanks,
Brian
I have unlimited access to an LED Wall through my work, and I want/need to set up a virtual production system with it. I can't find any tutorials that cover all of this, and trying to piece together separate tutorials with different initial setups is getting confusing and not yielding the results I would hope for. I'm hoping to find a start to finish tutorial.
I am utilizing the HTC Mars system for tracking, and Blackmagic Ursa Minis for the cameras.
Apparently Silver Spoon used Unreal Engine with body tracking by Xsens to pull off the real-time SpongeBob and Patrick sports commentary. Buzz online is that it was a major hit. Expect to see more computer animation crossover with live TV as virtual production matures. Exciting times.