/r/virtualproduction


A community for the growing world of virtual production: a technique that combines physical and virtual elements in real time (often using game engines) to produce media such as films, TV shows, live events, and AR/VR content.

Virtual Production

CAD, CAM, CAE

RULES

  • Please do not post memes. A meme is a repeated joke involving a template photo with a caption.

  • Please do not trade pirated materials. Talking about the subject is fine, but do not actually share any links.

  • Racism, sexism or any other kind of intolerance or discrimination will not be tolerated.

  • Trolling, posts intentionally inciting conflict, personal attacks, and spam will be removed.

  • Avoid posting blogspam or personally monetized links.

  • Breaking the rules will result in your account being temporarily silenced or banned.

RESOURCES

CosmoLearning
MIT OpenCourseware
LearningSpace
Math
WolframAlpha
Khan Academy
Paul's Online Math Notes
Math Insight
PatrickJMT Video Math Tutorials
Math24
Electronics
All About Circuits
Circuit Lab
Programming
C++.com
StackOverflow
Mechanics and Materials
MatWeb
MecMovies
Thermodynamics and Related
Cambridge Materials Science DB
Cambridge Materials Science Videos
Cal Poly Pomona ME Videos
ChemEng
LearnChemE Screencasts

Other Subreddits
r/AerospaceEngineering
r/AskElectronics
r/AskEngineers
r/CAD
r/CAM
r/CAE
r/ComputerScience
r/Engineering
r/EngineeringMemes (Memes can be found here)
r/EngineeringTechnology
r/ECE
r/LaTeX
r/MatLab
r/STEMdents
r/WomenEngineers
r/FE_Exam


3,476 Subscribers

1

Is there a way/product to do a large green screen like the Manfrotto StudioLink... but more affordable?

I need a big green screen with multiple walls, but the StudioLink seems like an expensive choice.

0 Comments
2024/04/03
01:26 UTC

5

Any advice on what kinds of projects I can work on to get a job in virtual production as an Unreal Engine artist?

Looking to break into the virtual production industry. I have been studying Unreal Engine and 3D modelling for 8 months now. I'm curious what projects I should have in my portfolio to land an entry-level role in VP.

4 Comments
2024/04/02
10:32 UTC

4

First scene of our horror-thriller Feature Film titled "Awake" using mocap, Unreal Engine, and Metahumans (still a little "uncanny valley" but we're getting there)

0 Comments
2024/04/01
02:35 UTC

1

Unreal Engine Automotive Masterclass

0 Comments
2024/03/31
10:03 UTC

2

Wasn't there a software or plugin for UE5 that helped with calibrating a Vive tracker mounted on a camera with the UE virtual camera?

I'm getting back into using Vive trackers and UE5 for some virtual production, and one of the trickiest things to get dialed in is making sure that the Vive tracker is synced with the virtual camera properly. To be more specific, my Vive tracker is mounted a few inches above where my camera's sensor is, so the rotation of my virtual camera isn't going to match because it will be offset on the Y axis and possibly the Z axis.

I remember there being some calibration software that helped with this but can't remember the name.
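Whatever the calibration tool turns out to be, the underlying relationship is simple: the camera sensor's pose is the tracker's pose composed with one fixed rigid offset, measured once. A minimal sketch of that composition (the names, axis convention, and the 7.5 cm offset are illustrative, not measurements from this post):

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the Vive tracker in world space (would come from tracking data each frame).
tracker_world = make_pose(np.eye(3), np.array([1.0, 1.5, 0.0]))

# Fixed tracker-to-sensor offset, measured once: here the sensor sits
# 7.5 cm below the tracker along the tracker's local Y axis.
tracker_to_camera = make_pose(np.eye(3), np.array([0.0, -0.075, 0.0]))

# Camera pose in world space = tracker pose composed with the fixed offset.
camera_world = tracker_world @ tracker_to_camera

print(camera_world[:3, 3])  # sensor position: 7.5 cm below the tracker
```

Per frame you would replace `tracker_world` with the live tracker pose; the offset matrix stays constant once measured, which is exactly what a calibration tool solves for.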

0 Comments
2024/03/28
11:51 UTC

6

Virtual Production from a BARN? Indie-Level Virtual Production on Greenscreen | Nuke | Blender | Lightcraft Jetset

4 Comments
2024/03/27
13:11 UTC

2

3 beamers instead of a curved screen

Hey folks! We're building a small-scale VP studio using short-throw beamers (projectors). My question is: do you know if it's possible to use 3 beamers instead of a curved wall to achieve a 180-degree viewing angle through the real camera?

I'm aware of the problem of changing distance from camera to screen - would it be possible to solve that issue through a nDisplay mapping with corresponding distortion in the corners?

The screens will always be out of focus for our purpose.

If anyone has got insight into that idea, it would be much appreciated! Thanks guys :D

9 Comments
2024/03/27
09:42 UTC

3

Virtual Production Action Short

1 Comment
2024/03/26
19:51 UTC

1

Any Zero Density experts here?

Need some help with the keyer and cyclorama in the new Reality 5.3 SP2 release. Hub version 1.5.

0 Comments
2024/03/25
22:06 UTC

2

Share Your Outsourcing Experiences: Take Our Quick Survey and Help Shape the Future!

Navigating the exciting world of outsourcing: whether you're a newcomer to the game or consider yourself a seasoned pro, 80 Level wants to hear from you with an all-new, brief, 4-minute survey designed to gather invaluable insights.

Why join?

  • Reflect on Your Journey: Review your experiences in the game development outsourcing industry.
  • Contribute to Growth: Share your insights to help build a collective understanding.
  • Gain Unique Perspectives: Access the results to see a wide range of experiences from our community, enriching your view.

Participate in this exciting journey, start the survey: https://80level.typeform.com/research

The results are going to be shared in 3-4 weeks. Thank you for adding your voice!

0 Comments
2024/03/25
20:31 UTC

3

Real & Virtual Set Alignment using Unity for Virtual Production?

Here's a high-level overview of the desired workflow:
https://youtu.be/DQT0Qy856mA?si=G6hksL8v2GEGPpfJ&t=379

Here's a detailed technical step by step example of the workflow using Unreal:
https://www.youtube.com/watch?v=J2pnk97zIDg

I'd be grateful to learn from, and to share, a tutorial on set alignment (real world with virtual world). There is no Unity-focused tutorial addressing this workflow. Do you have the knowledge and insight to create a tutorial and share it with the internet?

Currently, I'm using an iPhone and Unity's Virtual Camera app to sync with and control a Cinemachine virtual camera. I don't have a workflow for aligning our real world with the virtual Unity environment.

With regard to translating the above workflow from Unreal to Unity:

  • Unreal Blueprint > Calibration Point. What is the Unity equivalent?
  • Unreal Lens Calibrator > What is the Unity equivalent?

Note:
If this workflow doesn't translate, can you recommend an apt Unity Engine workflow for accomplishing the same outcome?

2 Comments
2024/03/24
03:33 UTC

4

Is Vive Lighthouse still the best low-budget tracking?

I wanted to set up a virtual studio and originally was just going to go with a green screen and markers, but after my WMR headset died I started looking into Lighthouse and found out it was used for tracking in video production, which I never knew. However, it seems all the videos using it are three years old, and that people who used to make content for video production using Vive trackers have moved on to different solutions (all too expensive for me, above $1k) or Aximmetry. However, I swear everyone changing to Aximmetry seems to be sponsored by Aximmetry... so I don't know. What would you guys recommend? Buy a Lighthouse setup or go a different way?

19 Comments
2024/03/19
19:05 UTC

2

Stereoscopic 180 video uses?

Just starting with virtual production. I have access to a Canon R5 with the 180 lens. I'm wondering if anyone has experimented with using this in vMix for 3D backgrounds, or for creating Gaussian splats? I know this isn't the traditional process, but I'm just wondering what the effect would be...

1 Comment
2024/03/16
21:55 UTC

3

Gaussian Splatting! - Computerphile

1 Comment
2024/03/15
01:58 UTC

0

Unreal LED volume camera lineup

Can you achieve a proper camera alignment and lens calibration with the Mars tracking system without using nDisplay or the other ICVFX plugins? Like, if I have a chair and an accurately scaled model in my Unreal scene, will it ever stick when translating the camera?

1 Comment
2024/03/14
23:37 UTC

5

Voice Activated Rokoko + Live Link Simultaneous Capture Workflow

0 Comments
2024/03/14
03:13 UTC

2

Vive Mars rover issue

Has anybody had an issue with the rover where it just goes back and forth between gray and white on the main display and never turns blue for tracking?

2 Comments
2024/03/07
01:11 UTC

3

How to make UE5 nDisplay move as the real-life actor walks?

Hi, I don't really know if I'm explaining this well. I'm pretty new to VP so forgive me if my terminology is wrong.

In this video, you can see the virtual background moving away as the actor walks

https://www.youtube.com/shorts/Lg4lRcnyhL4

I understand (mostly) nDisplay and integration with Mo-Sys, and I have a setup where I can shoot moving shots with the frustum + parallax, but in the linked scene it seems like the camera is pretty still, yet the background itself is moving backwards to add to the illusion that she's walking through a forest.

How would one go about accomplishing this? Did they attach the nDisplay to a camera rig rail or something?

Thanks in advance!!!

7 Comments
2024/02/26
20:18 UTC

2

FX3 for Virtual Production

I've been searching the internet trying to understand this small VMC-BNCM1 cable. It seems that it will allow the Sony FX3 to be genlocked, but I am not certain.

Can somebody help me with this? If I want to use the FX3 for VP, is this the magic cable for genlocking my camera?

1 Comment
2024/02/23
08:59 UTC

5

Horizontal scan lines

Hello, I manage equipment for a small TV studio. We recently got an LED wall with a NovaStar VX16s controller.

No matter what Hz, fps, and shutter combos I use, I always get horizontal scan lines across the screen on my camera whenever I do a tilting motion of any kind. No tilt, and the image is fine and the screen looks great.

I've tried using a RED Helium, a Canon C500 MKII, and an Ursa G2... same results across the board.

Any ideas on what's causing this and how to fix it? Again, it's only with tilting. Pans are fine.

13 Comments
2024/02/22
19:40 UTC

1

Quest 2 virtual camera

Student here, so not starting with the deepest knowledge. I'm trying to set up this: https://docs.unrealengine.com/4.27/en-US/BuildingWorlds/VRMode/

But I've got the Oculus Quest 2. Where it says the Rift is the supported headset, is that outdated by now, or is the Quest actually incompatible?

Cheers

1 Comment
2024/02/22
14:20 UTC

3

When using camera tracking software, what to specify for MFT cameras (film size, focal length, etc)?

Hi, I am new to virtual production. I am about to make some music videos where I film myself, rotobrush myself out, then use camera tracking software to get a decent camera solve. Maybe I'll do that in Blender. In most of these programs, you can specify real camera settings to help get a more accurate solve.

But I can't figure out what MFT, cropping, etc. imply for these settings.

For instance, I have a Lumix GH5. It has a sensor size of 17.3 mm x 13 mm.

I shot test footage at a 21 mm focal length. I chose 1080 25p output.

Now, the camera sensor has a ratio of about 1.33 (17.3/13), but the file is of course 1.777.

Do I just tell the solver to use the sensor's ratio? That is, assume the camera is seeing what the full sensor sees and that the 1080 result is simply cropped after the fact?

My goal is to get the most accurate solve, so really trying to figure this all out.
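One way to reason about the numbers in this post: if the camera reads out the full sensor width and crops the height to reach 16:9, the solver should be given the cropped sensor dimensions rather than the full 4:3 ones. A back-of-the-envelope sketch (the full-width-readout assumption is mine; actual GH5 video modes may crop differently):

```python
# Sensor dimensions from the post (Lumix GH5, roughly 4:3 sensor).
full_width_mm = 17.3
full_height_mm = 13.0

full_ratio = full_width_mm / full_height_mm        # ~1.33, the sensor's native aspect
output_aspect = 16 / 9                             # 1080p file, ~1.777

# Assume the full sensor width is used and the height is cropped to 16:9.
cropped_height_mm = full_width_mm / output_aspect  # ~9.73 mm

print(f"solver sensor size: {full_width_mm} mm x {cropped_height_mm:.2f} mm")
```

Under that assumption, you would give the solver 17.3 mm x ~9.73 mm (a 1.777 ratio matching the file), not the full 17.3 mm x 13 mm.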

If there is a more fitting group for these sorts of questions, please let me know!

thanks,

Brian

0 Comments
2024/02/21
17:05 UTC

1

My friend and I made a $0 short film using Unreal Engine

4 Comments
2024/02/21
13:36 UTC

21

Our LED studio holds a monthly event and we shoot small test scenes live. The one from last week turned out great!

6 Comments
2024/02/20
17:04 UTC

6

Are there any full LED wall virtual production tutorials for Unreal? Bonus if they use HTC Mars.

I have unlimited access to an LED wall through my work, and I want/need to set up a virtual production system with it. I can't find any tutorials that cover all of this, and trying to piece together separate tutorials with different initial setups is getting confusing and not yielding the results I'd hoped for. I'm hoping to find a start-to-finish tutorial.
I am using the HTC Mars system for tracking and Blackmagic Ursa Minis for the cameras.

7 Comments
2024/02/20
15:09 UTC

3

360 Stereo Media Player in UE5

0 Comments
2024/02/20
11:51 UTC

2

Tunel Zurqui, Costa Rica, Unreal Engine 5

0 Comments
2024/02/12
21:09 UTC

13

Last night's Super Bowl was a major moment for virtual production, as SpongeBob and Patrick took over as real-time rendered co-hosts. Congrats to the team at Silver Spoon for pulling it off

Apparently Silver Spoon used Unreal Engine with body tracking by Xsens to pull off the real-time SpongeBob and Patrick sports commentary. The buzz online is that it was a major hit. Expect to see more computer animation crossing over with live TV as virtual production matures. Exciting times.

0 Comments
2024/02/12
15:00 UTC
