/r/virtualproduction


A community for the growing world of virtual production. Virtual production combines physical and virtual elements in real-time (often using game engines) to produce media such as films, TV shows, live events, and AR/VR content.

Virtual Production

CAD, CAM, CAE

RULES

  • Please do not post memes. A meme is a repeated joke involving a template photo with caption.

  • Please do not trade pirated materials. Talking about the subject is fine, but do not actually share any links.

  • Racism, sexism or any other kind of intolerance or discrimination will not be tolerated.

  • Trolling, posts intentionally inciting conflict, personal attacks, and spam will be removed.

  • Avoid posting blogspam or personally monetized links.

  • Breaking the rules will result in your account being temporarily silenced or banned.

RESOURCES

CosmoLearning
MIT OpenCourseware
LearningSpace
Math
WolframAlpha
Khan Academy
Paul's Online Math Notes
Math Insight
PatrickJMT Video Math Tutorials
Math24
Electronics
All About Circuits
Circuit Lab
Programming
C++.com
StackOverflow
Mechanics and Materials
MatWeb
MecMovies
Thermodynamics and Related
Cambridge Materials Science DB
Cambridge Materials Science Videos
Cal Poly Pomona ME Videos
ChemEng
LearnChemE Screencasts

Other Subreddits
r/AerospaceEngineering
r/AskElectronics
r/AskEngineers
r/CAD
r/CAM
r/CAE
r/ComputerScience
r/Engineering
r/EngineeringMemes (Memes can be found here)
r/EngineeringTechnology
r/ECE
r/LaTeX
r/MatLab
r/STEMdents
r/WomenEngineers
r/FE_Exam

/r/virtualproduction

3,999 Subscribers

5

Questions on Render Node

We're a small VP studio with a 30'x12' LED wall, and we're trying to ensure our render node is running as well as it can. Some questions have come up over the last year about best practices, specifically relating to performance. We have two A6000 cards in the machine, but we'll often find levels run at unusable frame rates for ICVFX until a level is pared down to the bare bones. Is this to be expected?

We're also looking for ways to test and get benchmarks. We've sometimes wondered whether we are indeed using both GPUs, and whether we're using them in the most effective way. I haven't been able to find definitive answers on NVLink, SLI, multi-GPU, etc., so I'm wondering if anyone can weigh in.

Specs:

AMD Ryzen Threadripper Pro 5995WX 2.7GHz 64 Core, 256 GB RAM, 2x NVIDIA RTX A6000 48GB
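Not a definitive answer to the multi-GPU question, but a common sanity check, assuming a stock Unreal Engine ICVFX setup: nDisplay's multi-GPU path is driven by launch arguments and per-viewport GPU assignment in the nDisplay config (not SLI), and utilization on both boards can be watched from outside the engine. The executable and project names below are placeholders.

```shell
# Placeholder launch line -- project name/path are hypothetical.
# -dx12 selects the D3D12 RHI; -MaxGPUCount=2 opts the process into mGPU,
# after which the inner frustum can be given its own GPU index in the
# nDisplay config's viewport settings.
UnrealEditor.exe MyProject.uproject -game -dx12 -MaxGPUCount=2

# While a level is running, confirm both A6000s are actually loaded:
nvidia-smi --query-gpu=index,name,utilization.gpu,memory.used --format=csv -l 1
```

Inside the engine, the `stat gpu` and `stat unit` console commands help separate GPU-bound levels from game-thread-bound ones before paring content down.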

10 Comments
2024/11/08
18:01 UTC

6

From Pre-Edit to Final Cut: Using Jetset for Efficient Filmmaking | Virt...

0 Comments
2024/11/08
02:15 UTC

1

Currently counting dead pixels on our film wall. Does anyone have any easy methods for this kind of thing?

Looking for resources like

-Pixel-counting ruler

-PNGs for spotting dead pixels

-anything else
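One low-tech approach for the PNG part of the list: cycle full-frame solid red, green, blue, and white images on the wall and photograph each one; stuck or dead (sub)pixels stand out against the flat field. The sketch below writes such test frames with only the Python standard library — the resolution and file names are arbitrary placeholders, so match them to the wall's actual pixel map.

```python
import struct
import zlib

def solid_png(path, width, height, rgb):
    """Write a solid-color 8-bit RGB PNG using only the standard library."""
    def chunk(tag, data):
        body = tag + data
        return struct.pack(">I", len(data)) + body + struct.pack(">I", zlib.crc32(body))

    # IHDR: width, height, bit depth 8, color type 2 (truecolor), default flags
    ihdr = struct.pack(">IIBBBBB", width, height, 8, 2, 0, 0, 0)
    row = b"\x00" + bytes(rgb) * width  # each scanline starts with filter byte 0
    idat = zlib.compress(row * height)
    with open(path, "wb") as f:
        f.write(b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
                + chunk(b"IDAT", idat) + chunk(b"IEND", b""))

# One flat frame per primary plus white; cycle these on the wall and
# photograph each -- dead or stuck subpixels show against the flat field.
for name, rgb in [("red", (255, 0, 0)), ("green", (0, 255, 0)),
                  ("blue", (0, 0, 255)), ("white", (255, 255, 255))]:
    solid_png(f"deadpixel_{name}.png", 3840, 2160, rgb)
```

Showing the primaries separately also tells you which subpixel (R, G, or B) has failed, which a white-only frame cannot.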

5 Comments
2024/11/05
15:41 UTC

2

Switchboard/nDisplay animation sequence playback issue

2 Comments
2024/11/05
15:06 UTC

3

Transferring Solved Lens Data from 3DEqualizer to Unreal's Lens File within UE 5.4/5.5?

Hi everyone,

I'm not 100% sure if my question covers the standard virtual production method/workflow since my interest is specifically with only the Lens File and Lens Component setups, and not relying on using additional live-action plates or LED wall panels.

I've been wondering if anyone is familiar with the process of transferring raw static and/or dynamic solved lens data from 3DEqualizer into Unreal's Lens File setup. I've found very little information about this topic online, since it's not a real-time live-link workflow directly within Unreal.

The goal I have in mind is to investigate which distortion parameters are transferable, especially when the data is recorded per frame for an image sequence, and whether that can cover lenses that animate dynamically over time due to a focus pull or focal length change, as well as lens breathing and/or re-racks when using anamorphic lenses.

4 Comments
2024/11/02
08:52 UTC

2

nDisplay Light cards

Can someone please explain what these are used for?

3 Comments
2024/10/31
19:23 UTC

10

Basic training on Virtual Production

Hi folks, I'm looking for any free training, videos, and documentation that give a broad overview of how a Virtual Production studio "works". Basics like genlock, video processors, LED arrays, etc., and how they all work together is what I am looking for. I've been watching YouTube videos trying to learn what I can, but I'm wondering if anyone has any recommendations. Is there anything that covers the basics, a VPS-101 type of thing?

A little background: my company's marketing department is setting up a VPS, and my team (internal IT/AV) will be supporting them from time to time. I'd like my guys and me to learn some of the basics so we are all on the same page when we help out. Basics on motion tracking systems (Mo-Sys), how the signal flows from camera to Unreal to video wall, how video processors (Brompton) work, etc. I'm not expecting us to walk away from watching some videos as experts, but I want us to have a good feel for the process.

I would also like some of the managers and directors to go through this training so they have a better understanding of how the whole process works.

19 Comments
2024/10/31
15:44 UTC

1

Explanation & My Own Concept of a Virtual Production Studio

0 Comments
2024/10/31
13:19 UTC

2

Is this the correct way to set up Genlock settings for an LED Wall?

https://preview.redd.it/m3meargsm0yd1.png?width=3972&format=png&auto=webp&s=cc669544c1e9099d3242a5c4aa31bc608f735933

This image shows our current studio setup.

One PC is connected to an LED processor via HDMI, and the background is displayed on the LED wall through nDisplay.

We are currently using a GH4 as our test camera.

https://preview.redd.it/tizxdogsm0yd1.png?width=6546&format=png&auto=webp&s=dbd6cb0db86a0c6829425c6f17dcb8be3e892586

And this is the Genlock configuration diagram that I've studied and put together.

Is this the correct way to configure Genlock for an LED Wall?

And I have another question.

Is a Quadro graphics card absolutely necessary for Genlock between the camera and LED WALL?

I understand that a Quadro is needed when running nDisplay across multiple computers.

However, since our studio runs nDisplay on just one computer, we determined that we don't need a Quadro and built our computer with an RTX 4090 instead (a Quadro is also too expensive).

11 Comments
2024/10/31
04:17 UTC

9

What specifications should a camera have for virtual production shooting with LED Walls?

I am preparing to open a virtual production studio in Korea.

https://preview.redd.it/ksw42tbhhvxd1.jpg?width=966&format=pjpg&auto=webp&s=e1d644814affaf5cbb8caa26be37520d24ac33d8

https://preview.redd.it/em9pdqsjhvxd1.jpg?width=4000&format=pjpg&auto=webp&s=0c1bb17a397c4e4498d31e542aa9d680d770373c

We are currently testing with a Panasonic GH4 camera, and the results are absolutely terrible.

The footage is so bad that we can't even tell if we're doing things correctly.

When we get even slightly closer to the wall, there's severe moiré, the colors look strange, and overall it's just terrible.

However, when some clients came to our studio and shot with Sony cameras, the results were decent (though this was shooting 2D video played on the LED wall, not Unreal Engine content).

Therefore, we feel it's urgent to establish what the standard specifications should be for cameras suitable for virtual production.

I don't think it's possible to get detailed camera recommendations from this Reddit post.

I would be grateful even if you could just give me a rough estimate of what level of camera would be suitable.

24 Comments
2024/10/30
10:53 UTC

3

Rendering out the outer frustum in HQ possible?

Hey there. In order to create a few "simpler" setups on our LED wall, we've been doing some UE 5.4 renders to put on the wall instead of doing live tracking. (This of course means a fixed view without parallax, and that's fine for this purpose.) Instead of rendering one specific cine camera, is it possible to render the (curved) LED wall projection that's used for the outer frustum, meaning at the high quality that the Movie Render Queue allows? That would probably work better in terms of a more accurate display of the world...

Thanks for any advice!

4 Comments
2024/10/30
09:46 UTC

7

Is there still a need for Aximmetry?

This might not be the smartest question, but I'm serious here.

I've set up a virtual production with a green screen room. I'm using the Vive Mars setup, the BMD Ultimatte 4K, and an otherwise all-in-UE5.4 setup, which gets me all the way to a final composition over SDI outs to the preview screen, and I record takes to render out with path tracing afterwards.

What exactly does Aximmetry do to lighten the load? I see that it manages hardware and tracking, can load scenes, and can key out the green, but is it currently still beneficial enough to justify the hefty price?

We're currently looking to optimize our studio to be more reliable, although we are already in a pretty good spot: we get 50 fps with scenes that are all Megascans and have foreground elements in front of the recorded person in the green screen too.

I'm genuinely asking this because I can't find anything about Aximmetry use for VP that's less than two years old, and two years ago UE was wildly different when it comes to VP...

7 Comments
2024/10/29
10:04 UTC

17

We create any environment for your Virtual Production needs

Hello.

As the title says, we offer this service worldwide.

We are based in France and we have teams, so we can scale and deliver pretty much anything remotely.

This allows us to collaborate with studios outside of France.

Quality is always photorealistic, but how much depends on your needs. We recreated the Eiffel Tower (our own dataset), but we can also give you a soccer field or the moon.

The video is a BTS of this clip, where we delivered 6 environments in 5 days: https://youtu.be/YDBIxhq6pH4

Since then, it's been wonderful times and happiness mixed together. The last two jobs were 4 environments in ~48h, optimization included, and 2 environments (pretty complex) in 72 hours.

We can definitely deliver to any standard, but please allow us more time if you call us; the results will always be better.

8 Comments
2024/10/26
23:55 UTC

1

The Human Race - Short Film (2024) An in-depth profile of Astral Prime spaceship racing during the mid-21st century, as political and economic tensions on Earth put the future of humanity at risk.

2 Comments
2024/10/25
23:42 UTC

12

Where can I purchase Unreal Environments for Virtual Production?

I am looking for some template scenes to use as backdrops. What online libraries cater to modern film backdrops?

21 Comments
2024/10/25
14:13 UTC

5

My First Music Video Production in UE (would love to read your feedback)

1 Comment
2024/10/25
11:03 UTC

1

XR production with a robotic crane for Olympic programs.

0 Comments
2024/10/21
03:22 UTC

10

How is this done: Image on Plane or Composer?

What are your thoughts?

4 Comments
2024/10/19
18:35 UTC

3

How could I work with unreal in the background?

Hi there!

I am doing an exercise in capturing the movement of an avatar and sending it to OBS, and then compositing it over a screen capture.

Everything works perfectly up to OBS, but when I open a game, the motion capture freezes.

I have discovered that this happens when Unreal stays in the background and minimized.

Would anyone know how to make Unreal Engine run normally even in the background? I have activated this option in Editor Preferences:

  • Use Less CPU when in Background

But without any change.

Looking at the Windows Task Manager, I see that the GPU is at 100% usage while the CPU and RAM do not reach 50% of their capacity. How could I reduce the load?

Thank you so much

0 Comments
2024/10/15
11:02 UTC

4

Voyagers - Our Experimental Short Film Using Virtual Production & Motion Capture! Any Feedback?

Hello! We just wanted to share with the community our experimental sci-fi short film Voyagers: https://youtu.be/e4KUlrze5yc

This project was a hands-on learning experience where we explored the use of virtual production and motion capture technology in filmmaking. It’s the product of a professional development project by MBD’s team, a learning experience that helped us master new skills and push creative boundaries.

While making the film we documented every step, including the challenges we faced and valuable tips on virtual production and motion capture, so if you are curious to see the behind-the-scenes process, check out our series ‘Voyagers: A Learning Journey’: https://www.youtube.com/playlist?list=PLrDyEy6n2NULaxB1JoTfcfCaIJL6CsdSF

We'd love to hear your thoughts and feedback. Thanks for watching!

2 Comments
2024/10/15
09:04 UTC

1

STRETCH PROBLEM IN NDISPLAY UE5

I have this problem with nDisplay: it shows on only half of the LED wall. How can I fix it?

3 Comments
2024/10/11
23:32 UTC

1

Vive Mars tracking system is not detecting base stations.

I have a Vive Mars tracking system that is not detecting base stations.

I have a Vive Mars on firmware 3.1. It was working, but then I had a big project, so I stepped away from using it and packed it up. Six months have passed and I have time now. I unpacked it and worked on setting it up the way I had it.

The base stations were not being detected, so I turned them off and on. Same thing, so I turned both off and turned them back on one at a time. I didn't see any changes.

I moved on to turning one off and changing the channel until it worked. I would change the channel on a base station and it was still not detected after a minute or two. I would also restart the Mars itself.

I have updated and downgraded the firmware. I even asked my bud to lend me his base stations and Mars unit to see if I was going crazy; his was working an hour ago. It also has issues detecting his base stations and mine. I tried his base stations first to verify that they worked. So far no luck. I need help.

1 Comment
2024/10/11
01:17 UTC

6

UNREAL ENGINE LED WALL SETUP

I have a Vive Mars system and an LED wall, and I am trying to use them for virtual production. I have nDisplay configured, but I don't know how to send the live video from Unreal Engine to the LED wall.

I have a one-PC setup, so I am using it for both rendering and the editor. I have dual monitors, with the LED wall connected over HDMI. When I launch the level in Switchboard, it appears on my monitor with a UI like a touch controller, and I don't know how to move it onto the wall.

Is this the correct way to do this?

Any suggestions are welcomed.

7 Comments
2024/10/10
23:31 UTC

2

how to change the FOV & Focal Length in Inner-Frustum on LED Volume?

A question for you all: while changing the focal length on our ICVFX camera, the inner frustum on our LED wall just resizes, and we don't see any stretching or squeezing of the field of view inside it (the thing that happens on real cameras when we change to a lens with a different focal length).

Is this the normal behavior of Unreal Engine virtual production, or are we missing something? (All of our lenses are also calibrated properly.)

Our Director of Photography insists that the perspective of the inner frustum must change according to the focal length, which it doesn't, as you can see in the video below. Is there any way we can actually and properly change the FOV of the inner frustum projected on our LED wall, in addition to it being resized?

https://reddit.com/link/1fzoksn/video/7o79zt66optd1/player

21 Comments
2024/10/09
10:52 UTC

3

Does NVIDIA's DLSS work with nDisplay?

When creating a project with high-quality Megascans assets and running nDisplay instances on an LED wall at 4K resolution, the frame rate drops, making the tracking noticeably slow.

It seems to be running at about 30 frames per second.

Of course, this might not be a big problem unless there are intense action scenes, but securing a higher frame rate still seems important.

So, I plan to use lower- and mid-detail Megascans assets in the future and also actively utilize external solutions like DLSS.
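For reference, and hedged because DLSS support under nDisplay has varied across engine and plugin versions: DLSS in Unreal comes from NVIDIA's DLSS plugin and is toggled through console variables, which can be pinned in `ConsoleVariables.ini` so every cluster node picks them up. The variable names below come from the NVIDIA plugin and should be verified against the installed plugin version.

```ini
; Engine/Config/ConsoleVariables.ini (cvar names from NVIDIA's DLSS plugin;
; verify against your installed plugin version)
r.NGX.Enable=1
r.NGX.DLSS.Enable=1
```

Whether the upscaled result composites correctly across the inner- and outer-frustum viewports is worth testing on the wall before committing; `stat gpu` before and after gives a quick read on the actual frame-time win.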

7 Comments
2024/10/08
10:20 UTC

9

My first project- Kaun Talha || A music video teaser done via UE5.4.2

2 Comments
2024/10/08
03:32 UTC

6

Transitioning from VFX to Virtual Production. Tips?

Hi, I have been working in the film and TV VFX industry just over 3 years now, I started in tech pipeline and moved to being a VFX Editor + Data I/O. I have grown an interest in VP and the more I learn about it, the more I want to switch over to that side of the industry. I am based in London, UK.

I have been doing the free online training created by Disguise about virtual production workflows. It includes a lot of things that I already know, such as colour management, codecs, etc., but there are also new things that I am learning, especially on the hardware side. The next thing I would probably learn is using Designer d3. I have experience using Unreal Engine, but not in a VP context, so that is on the list too.

Looking at the job roles available, I could most likely transition to being a DIT or into Data Management, but I am looking forward to exploring Systems Design and Engine/Volume/LED operator roles.

What I am trying to understand is: what tools should I be learning, and what training is required? Any tips on finding hands-on experience, either in the UK or outside? Also, what's your opinion on how the VP industry is doing compared to film and TV VFX, given external factors such as strikes, unions, etc.? What are the day rates and salaries like?

Thanks!

0 Comments
2024/10/06
09:28 UTC

4

VP Certification in LA

Fyi for anyone looking to get certified in virtual production, this is a 4-day intensive program… top instructors and facility, small class size. Oct 11-14 in Los Angeles. Taught by Synapse Virtual Production and Rochester Institute of Technology (RIT)

More info at:

https://virtualproduction.magic.rit.edu/masterclass.html

5 Comments
2024/10/03
22:56 UTC

5

How We Made A Short Film Using VP And Motion Capture As An Indie Studio

Hi everyone!

We just released the fourth and final episode of a behind-the-scenes series called "Voyagers: A Learning Journey": https://youtu.be/njiVXmu8BtQ

In this episode you will be taken behind the scenes to witness how we’ve put together video files, motion capture data and sound effects to bring to life our short film ‘Voyagers’ using Unreal Engine and DaVinci Resolve. You will see the creative team in action and hear how they managed to keep everything on track even when unexpected challenges popped up!

If you are a creative technologist, student filmmaker, or eager to learn the ins and outs of making a short film using virtual production and motion capture, you might find this series useful!

We are MBD, an award-winning UK-based arts organisation specialising in digital immersive storytelling using VR, AR and 360°.

Every comment is welcome! :)

0 Comments
2024/10/01
10:12 UTC
