/r/virtualproduction


A community for the growing world of virtual production. Virtual production combines physical and virtual elements in real-time (often using game engines) to produce media such as films, TV shows, live events, and AR/VR content.

Virtual Production


RULES

  • Please do not post memes. A meme is a repeated joke involving a template photo with caption.

  • Please do not trade pirated materials. Talking about the subject is fine, but do not actually share any links.

  • Racism, sexism or any other kind of intolerance or discrimination will not be tolerated.

  • Trolling, posts intentionally inciting conflict, personal attacks, and spam will be removed.

  • Avoid posting blogspam or personally monetized links.

  • Breaking the rules will result in your account being temporarily silenced or banned.


/r/virtualproduction

4,056 Subscribers

3

We made a free plugin for powering live visuals with music

1 Comment
2024/11/29
10:00 UTC

9

Our AR test in a small studio setup :)

0 Comments
2024/11/27
08:26 UTC

19

First images I’m proud of.. finally

It's been almost 3 months since I started building my VP studio, broadcast oriented. The choices made were Aximmetry, Vive Mars (+FIZtrack), and vMix, and we've just received a JibPlus from Edelkrone for that nice, smooth floating jib movement.

It's a pretty hard journey, and learning how to use Aximmetry is a daily struggle. Some days are a 100% dedicated fight with your gear, software, video signals, network… And on others, the planets align and it's a kind of magic.

The zoom movement at the end needs to be improved; we have some stuttering on the focus motor. It's almost nothing, but the FIZtrack loses some data, enough to be noticed on screen. That's probably the most difficult part of VP: calibration and sync must be so precise, or you just throw your recordings in the bin.

9 Comments
2024/11/27
00:14 UTC

5

nDisplay picture is strangely curved

Hello, I'm unsure how to word this but I will try my best.

For some reason, on my studio's nDisplay, the picture seems to be at a weird angle, giving it a curved appearance? Like it distorts the environment strangely?

It happens in every environment.

If I have the frustum activated (using Mosys for camera tracking) it fixes the issue and shows the correct view, keeping objects straight etc.

BTW, we already re-made the wall mesh in Blender to make sure it was accurate and it is :/

It distorts the wall and closet doors to curve downwards (box added to block my coworker, who was standing in)

How the actual environment looks; all walls horizontally straight

Again, this temple in the background is completely straight, YET it distorts on nDisplay

The nDisplay config window; you can see it really clearly here. As I'm sure most of you know, that building on the right side is completely straight, yet it gets strangely angled and curved on nDisplay

14 Comments
2024/11/21
16:56 UTC

1

Environment lighting

When lighting your environments for LED walls, do you use real-world lighting levels (for example, 1700 lumens for a 100-watt lightbulb), or do you just use what looks good while building the environment before putting it on the wall?
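
For anyone wanting to match real-world levels, here is a rough point-source conversion from bulb lumens to lux at a distance. This is an idealized sketch (a bare bulb treated as a point source; real fixtures with reflectors and diffusers will differ substantially):

```python
import math

def lux_at_distance(lumens: float, distance_m: float) -> float:
    """Illuminance (lux) from an idealized bare point source.

    Real fixtures have reflectors and diffusers, so treat this
    only as a ballpark when matching real-world light levels.
    """
    # A point source spreads its lumens over a sphere of 4*pi steradians;
    # intensity in candela = lumens / (4*pi), and lux = candela / d^2.
    candela = lumens / (4 * math.pi)
    return candela / (distance_m ** 2)

# A ~100 W incandescent bulb emits roughly 1600-1700 lumens.
print(round(lux_at_distance(1700, 2.0), 1))  # ~33.8 lux at 2 m
```

Whether physically based values look right on the wall still depends on the wall's calibration and the camera's exposure, so many people treat numbers like this as a starting point and then adjust by eye.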

1 Comment
2024/11/21
03:37 UTC

6

Help needed with strange phenomenon (Info below)

Hi everyone, 

We are presently testing new possible LED panels for our studio and we have encountered this really strange phenomenon. Some details:

  • It's always roughly horizontal, no matter if I twist the camera horizontally off axis or tilt the LED wall piece
  • When genlocked, the stripe sits still and moves accordingly with vertical camera movement (I pushed it slightly off sync here on purpose because the problem is easier to show on video)
  • In the reflection, the stripe moves on (!), not as a reflection but as something else. This only concerns light projected by the LED, not ambient light in the studio, which is mind-breaking for me.
  • We use a rolling shutter cam. We tried with a global shutter rental camera; in that case the stripe sits still on the wall and is no longer camera-movement dependent, but it is still (albeit less) visible.
  • The camera is presently at 59.94 fps with a 360° shutter angle (we have tried other combinations; the stripe gets broader at 180°)
  • We have tried many different cameras, synced via genlock and unsynced, and have always found something like this
  • It is independent of content and content source; we even tried two different LED processors
  • It shows up the most in darker / dim picture areas.
  • I blew up the footage a bit in post to make it more visible. Of course this is not visible to the naked eye.
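
The shutter-angle observation in the bullets above can be put into numbers. A quick sketch of the relationship (the 3840 Hz LED refresh rate is an assumption for illustration; check your processor's actual setting):

```python
def exposure_time_s(fps: float, shutter_angle_deg: float) -> float:
    """Exposure per frame: the fraction of the frame interval the shutter is open."""
    return (shutter_angle_deg / 360.0) / fps

# At 59.94 fps with a 360-degree shutter the sensor integrates the full frame
# interval (~16.68 ms); at 180 degrees only half of it (~8.34 ms), so each
# scanline averages fewer LED PWM refresh cycles and banding gets broader,
# matching the "stripe gets broader at 180 degrees" observation.
t360 = exposure_time_s(59.94, 360)
t180 = exposure_time_s(59.94, 180)
led_refresh_hz = 3840  # assumed panel refresh rate for illustration
print(f"{t360*1000:.2f} ms -> {t360*led_refresh_hz:.1f} refresh cycles averaged")
print(f"{t180*1000:.2f} ms -> {t180*led_refresh_hz:.1f} refresh cycles averaged")
```

This doesn't explain the reflection behavior, but it is consistent with the stripe being a beat between the camera's integration window and the panel's PWM refresh.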

Any ideas what we are witnessing here, and maybe also how we can tackle it?

Thanks everyone!! Appreciated.

https://reddit.com/link/1gvowfd/video/ue48qqrd022e1/player

20 Comments
2024/11/20
12:58 UTC

3

5.5 Integration

Has anyone been able to get 5.5 running on their stage? I have tried to get it working but it keeps crashing or freezing.

6 Comments
2024/11/18
22:22 UTC

8

Tyson vs Paul graphics

Hi

Just watched some bits from the Mike Tyson vs Jake Paul boxing fight, and curiosity has gotten the best of me. Does anyone have any knowledge of who or what company did the virtual graphics for the program? Or even better, some software/hardware specs? I'm assuming at least one Spidercam and probably some other assortment of Stype stuff.

1 Comment
2024/11/17
22:43 UTC

5

Jetset (Dynamic LiDAR Scaling Fix?)

In following along with the “Autoshot Unreal Round Trip” tutorial, I’m attempting to replicate this specific step in the process (but within Unity), beginning at this timestamp: https://youtu.be/XK_FpXXBU7w?si=rUbzpH_ERUbTq-By&t=2299

My Jetset track is solid. My recorded Jetset comp is solid. I seem to be facing the same problem demonstrated in the video, caused by the inaccuracies of the iPhone LiDAR system.

In my efforts to replicate the outlined solution (within Unity) as provided by the video for Unreal, I’m not achieving a similar result.

Attached are two screengrabs: https://imgur.com/a/DlyCQUh

  1. The in-iPhone Jetset comp.
  2. The Unity Editor with Animation Window & Hierarchy Window open.

Replication notes: I’ve tried deleting keyframes and manually entering a position for the Image Plane gameObject’s Z-axis position (Unity equivalent). I’ve also tried deleting keyframes and manually entering a scale for the Image Plane gameObject’s Z-scale. Neither approach succeeds in replicating the process outlined in the linked video tutorial.

My three questions:

  1. Which gameObject animation transform properties should be deleted?
  2. Which gameObject should have its X-Position location altered?
  3. What might be the correct workflow for getting the image plane track to perform in Unity as it does within Jetset?
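
For question 3, one piece of the math may help: a pinhole-camera image plane only composites correctly if its scale matches its distance, so if you re-key the plane's Z position you also have to rescale it to keep filling the frustum. A minimal sketch (plain Python rather than Unity C#, and the numbers are hypothetical):

```python
import math

def plane_size_at_distance(distance: float, fov_v_deg: float, aspect: float):
    """Width/height an image plane needs at `distance` from a pinhole camera
    to exactly fill its frustum (ignores lens distortion)."""
    height = 2.0 * distance * math.tan(math.radians(fov_v_deg) / 2.0)
    return height * aspect, height

# Hypothetical numbers: a plane parented to the camera at 3 m, vertical FOV 60.
# If you move the plane to a new Z, re-run this and apply the new scale so the
# plate keeps filling the frame instead of drifting as in the tutorial.
w, h = plane_size_at_distance(distance=3.0, fov_v_deg=60.0, aspect=16 / 9)
```

This is only the geometric relationship, not Jetset's actual implementation; but if Unity's animated Z and scale curves don't satisfy it frame by frame, the comp will show exactly the kind of mismatch described.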
2 Comments
2024/11/17
20:47 UTC

6

How would you even get started with virtual production?

Besides building a studio, do you just need to get a foot in the door with a working team?

Doesn't seem like there's an easy inroad besides spec work.

14 Comments
2024/11/17
15:53 UTC

6

For Hire - Unreal Artists / Technicians for VP

Looking for an artist to assist with our Unreal scenes. We have modeled the majority of the geometry in 3ds Max (mainly architectural imagery like exteriors, boardwalk, retail, office interiors). Tasks would be to translate the finished 3ds Max files, replace some assets (plants), redo lighting, etc. Looking for help next week. I can send a brief via email if you have a portfolio. We can sort out pricing after.

Freelance, remote work per scene.

6 Comments
2024/11/15
20:34 UTC

4

Techviz for commercial productions

How commonly is proper techviz implemented in virtual production for commercials? Is it typically offered but omitted due to budget/timeline constraints?

5 Comments
2024/11/14
17:29 UTC

3

Filmmaking software questionnaire

Hello everyone! 

I’m a final-year filmmaking student, and I’m currently writing a dissertation on how advancements in technology and software have made advanced filmmaking more accessible. To get a range of personal insights, I’ve created a short questionnaire on how these tools have impacted people’s careers. If this topic resonates with you, I’d be grateful if you could take a few minutes to share your thoughts: https://forms.office.com/e/2t5LSGrZyt

Thank you for helping with my research!

0 Comments
2024/11/14
12:57 UTC

5

Questions on Render Node

We're a small VP studio with a 30'x12' LED wall. We are trying to ensure our render node is running the best it can. We've had some questions come up over the last year as far as best practices go, specifically relating to performance. We have two A6000 cards in the machine, but we'll often find levels run at unusable frame rates for ICVFX until a level is really pared down to the bare bones. Is this to be expected?

Also, just looking for ways to test and get benchmarks. We've sometimes wondered whether we are indeed using both GPUs, and using them in the most effective way. I haven't been able to find definitive answers on NVLink, SLI, multi-GPU, etc., so I'm just wondering if anyone can weigh in on the matter.

Specs:

AMD Ryzen Threadripper Pro 5995WX (64 cores, 2.7 GHz), 256 GB RAM, 2x NVIDIA RTX A6000 48 GB
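
On checking whether both GPUs are actually doing work: a quick sanity check is to watch per-GPU utilization from `nvidia-smi` while a level is running. A small sketch (the parsing helper is mine; the `nvidia-smi` query flags are standard):

```python
import subprocess

def gpu_utilization(csv_text: str) -> dict:
    """Parse output of
    `nvidia-smi --query-gpu=index,utilization.gpu --format=csv,noheader,nounits`
    into {gpu_index: utilization_percent}."""
    stats = {}
    for line in csv_text.strip().splitlines():
        idx, util = (field.strip() for field in line.split(","))
        stats[int(idx)] = int(util)
    return stats

# Poll the real tool like this (requires the NVIDIA driver to be installed):
# out = subprocess.check_output(
#     ["nvidia-smi", "--query-gpu=index,utilization.gpu",
#      "--format=csv,noheader,nounits"], text=True)
# print(gpu_utilization(out))
#
# If one A6000 sits near 0% while nDisplay renders, the second GPU is idle;
# multi-GPU rendering in nDisplay must be configured explicitly (per-viewport
# GPU assignment), and SLI/NVLink alone does not enable it.
print(gpu_utilization("0, 97\n1, 3"))  # {0: 97, 1: 3}
```

A utilization snapshot like this won't replace a proper benchmark, but it answers the "are both cards even being used" question in seconds.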

11 Comments
2024/11/08
18:01 UTC

7

From Pre-Edit to Final Cut: Using Jetset for Efficient Filmmaking | Virt...

0 Comments
2024/11/08
02:15 UTC

1

Currently counting dead pixels on our film wall. Does anyone have any easy methods for this kind of thing?

Looking for resources like:

  • A pixel-counting ruler

  • PNGs for revealing dead pixels

  • Anything else
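
If you just need solid test frames at the wall's exact resolution, they are easy to generate. A stdlib-only sketch (PPM instead of PNG to avoid dependencies; convert with any image tool if your playback chain needs PNGs, and the 1920x1080 canvas is a placeholder for your processor's actual canvas size):

```python
import pathlib

def write_ppm(path: str, width: int, height: int, rgb: tuple) -> None:
    """Write a solid-color test frame as a binary PPM (stdlib only)."""
    header = f"P6 {width} {height} 255\n".encode()
    pathlib.Path(path).write_bytes(header + bytes(rgb) * (width * height))

# Full red/green/blue/white frames make dead and stuck pixels pop out;
# cycle through them while walking the wall (a dead pixel stays dark on all,
# a stuck subpixel shows only on some colors).
WALL_W, WALL_H = 1920, 1080  # placeholder; use your processor's canvas size
for name, rgb in [("red", (255, 0, 0)), ("green", (0, 255, 0)),
                  ("blue", (0, 0, 255)), ("white", (255, 255, 255))]:
    write_ppm(f"deadpixel_{name}.ppm", WALL_W, WALL_H, rgb)
```

Playing these full-screen through the normal content path (not just on a laptop) also verifies the processor mapping at the same time.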

5 Comments
2024/11/05
15:41 UTC

2

switchboard/ndisplay animation sequence playback issue

2 Comments
2024/11/05
15:06 UTC

3

Transferring Solved Lens Data from 3DEqualizer to Unreal's Lens File within UE 5.4/5.5?

Hi everyone,

I'm not 100% sure if my question covers the standard virtual production method/workflow since my interest is specifically with only the Lens File and Lens Component setups, and not relying on using additional live-action plates or LED wall panels.

I've been wondering if anyone is familiar with the process of transferring raw static and/or dynamic solved lens data from 3DEqualizer into Unreal's Lens File setup? There's very little information I've found about this topic online, since it's not a real-time live link workflow directly within Unreal.

The goal I have in mind is to investigate which distortion parameters are transferable, especially if the data is recorded for each frame of an image sequence, and whether that can cover lenses that are dynamically animating over time due to a change in focus pull or focal length, as well as lens breathing and/or re-racks when using anamorphic lenses.
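
For reference, Unreal's spherical Lens File distortion is, as far as I can tell, the usual Brown-Conrady-style radial/tangential form with K1..K3 and P1/P2. A sketch of the forward distortion (3DE's native models, e.g. Radial Standard Degree 4, are parameterized differently, so moving data across is generally a re-fit to this form rather than a 1:1 parameter copy):

```python
def brown_conrady(xn: float, yn: float,
                  k1=0.0, k2=0.0, k3=0.0, p1=0.0, p2=0.0):
    """Distort normalized (undistorted) image coords with the standard
    Brown-Conrady radial + tangential model."""
    r2 = xn * xn + yn * yn
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = xn * radial + 2 * p1 * xn * yn + p2 * (r2 + 2 * xn * xn)
    yd = yn * radial + p1 * (r2 + 2 * yn * yn) + 2 * p2 * xn * yn
    return xd, yd

# A point at normalized x=0.5 gets pushed radially outward by positive k1:
print(brown_conrady(0.5, 0.0, k1=0.1))  # ~(0.5125, 0.0)
```

For a per-frame solve (focus pull, zoom), one approach is to re-fit these coefficients per frame and key them into the Lens File's distortion table across the focus/zoom curve; how well that captures breathing and anamorphic re-racks is exactly the open question in the post.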

4 Comments
2024/11/02
08:52 UTC

2

nDisplay Light cards

Can someone please explain what these are used for?

3 Comments
2024/10/31
19:23 UTC

10

Basic training on Virtual Production

Hi folks, I'm looking for any free trainings, videos, and documentation that give a broad overview of how a virtual production studio "works". Basics like genlock, video processors, LED arrays, etc., and how they all work together are what I am looking for. I've been watching YouTube videos trying to learn what I can, but I'm wondering if anyone has any recommendations? Is there anything that covers the basics, a VPS-101 type of thing?

A little background: my company's marketing department is setting up a VPS, and my team (internal IT/AV) will be supporting them from time to time. I'd like me and my guys to learn some of the basics so we are all on the same page when we help out. Basics on motion tracking systems (Mo-Sys), how the signal flows from camera to Unreal to the video wall, how video processors (Brompton) work, etc. I'm not expecting us to walk away from watching some videos as experts, but I want us to have a good feel for the process.

I would also like some of the managers and directors to go through these trainings so they have a better understanding of how this whole process works.

19 Comments
2024/10/31
15:44 UTC

1

Explanation & My Own Concept of a Virtual Production Studio

0 Comments
2024/10/31
13:19 UTC

2

Is this the correct way to set up Genlock settings for an LED Wall?

https://preview.redd.it/m3meargsm0yd1.png?width=3972&format=png&auto=webp&s=cc669544c1e9099d3242a5c4aa31bc608f735933

This image shows our current studio setup.

One PC is connected to an LED processor via HDMI, and the background is displayed on the LED wall through nDisplay.

We are currently using a GH4 as our test camera.

https://preview.redd.it/tizxdogsm0yd1.png?width=6546&format=png&auto=webp&s=dbd6cb0db86a0c6829425c6f17dcb8be3e892586

And this is the Genlock configuration diagram that I've studied and put together.

Is this the correct way to configure genlock for an LED wall?

And I have another question.

Is a Quadro graphics card absolutely necessary for genlock between the camera and the LED wall?

I understand that a Quadro is needed when running nDisplay with multiple computers.

However, since our studio runs nDisplay on just one computer, we determined that we don't need a Quadro and built our machine with an RTX 4090 instead (a Quadro is also too expensive).

11 Comments
2024/10/31
04:17 UTC

8

What specifications should a camera have for virtual production shooting with LED Walls?

I am preparing to open a virtual production studio in Korea.

https://preview.redd.it/ksw42tbhhvxd1.jpg?width=966&format=pjpg&auto=webp&s=e1d644814affaf5cbb8caa26be37520d24ac33d8

https://preview.redd.it/em9pdqsjhvxd1.jpg?width=4000&format=pjpg&auto=webp&s=0c1bb17a397c4e4498d31e542aa9d680d770373c

We are currently testing with a Panasonic GH4 camera, and the results are absolutely terrible.

The footage is so bad that we can't even tell if we're doing things correctly.

When we get even slightly closer to the wall, there's severe moiré, the colors look strange, and overall it's just terrible.

However, when some clients came to our studio and shot with Sony cameras, the results were decent (though this was shooting 2D video played on the LED wall, not Unreal Engine content).

Therefore, we feel it's urgent to establish what the standard specifications should be for cameras suitable for virtual production.

I don't think it's possible to get detailed camera recommendations from this Reddit post.

I would be grateful even if you could just give me a rough estimate of what level of camera would be suitable.
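
On the moiré specifically: it is mostly a function of the wall's pixel pitch and the camera-to-wall distance (the wall's pixel grid beating against the sensor's photosite grid), not the camera brand. A common rule of thumb, sketched below (the 1.5x factor is a rough assumption; practitioners quote anywhere from 1x to 2x):

```python
def min_shoot_distance_m(pixel_pitch_mm: float, factor: float = 1.5) -> float:
    """Rule-of-thumb minimum camera-to-wall distance to avoid moire:
    roughly `factor` times the pixel pitch, read in meters
    (e.g. a 2.6 mm pitch wall wants roughly 3-5 m of distance)."""
    return pixel_pitch_mm * factor

# Beyond distance: a larger sensor, a longer/softer lens, keeping the wall
# slightly out of the depth of field, and genlocking the shutter to the
# panel refresh all reduce moire far more than switching camera brands.
print(min_shoot_distance_m(2.6))  # ~3.9 m for a 2.6 mm pitch wall
```

That would also explain the GH4 results in the post: a small Micro Four Thirds sensor shot close to the wall is close to a worst case, while a larger-sensor camera at the same distance fares much better.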

24 Comments
2024/10/30
10:53 UTC

3

Rendering out the outer frustum in HQ possible?

Hey there. In order to create a few "simpler" setups on our LED wall, we've been doing some UE 5.4 renders to put on the wall instead of doing live tracking. (This of course means a fixed view without parallax, and that's fine for this purpose.) Instead of rendering one specific cinecam, is it possible to render the (curved) LED wall projection that's used for the outer frustum, meaning in the high quality that the Movie Render Queue allows? That would probably give a more accurate display of the world...

Thanks for any advice!

4 Comments
2024/10/30
09:46 UTC

6

Is there still a need for Aximmetry?

This might not be the smartest question, but I'm serious here.

I've set up a virtual production with a green screen room. I'm using the Vive Mars setup, the BMD Ultimatte 4K, and an otherwise all-in-UE5.4 setup which gets me all the way to a final composition over SDI outs to the preview screen, and I record takes to render out with path tracing afterwards.

What exactly does Aximmetry do to lighten/ease the load? I see that it manages hardware and tracking, can load scenes, and can key out the green, but is it currently still beneficial enough to pay the hefty price for it?

We're currently looking to optimize our studio to be more reliable, although we are already in a pretty good spot; we get 50 fps with scenes that are all Megascans and have foreground elements in front of the recorded person in the green screen too.

I'm genuinely asking this because I can't find anything about Aximmetry use for VP that's less than 2 years old. Two years ago, UE was wildly different when it comes to VP...

7 Comments
2024/10/29
10:04 UTC

18

We create any environment for your Virtual Production needs

Hello.

As the title says, we offer this service worldwide.

We are based in France, and we have teams, so we can scale and deliver pretty much anything remotely.

This allows us to collaborate with studios outside of France.

Quality is always photorealistic, but how much really depends on your needs. We recreated the Eiffel Tower from our own dataset, but we can also give you a soccer field or the moon.

The video is a BTS of this clip, for which we delivered 6 environments in 5 days: https://youtu.be/YDBIxhq6pH4

Since then, it's been wonderful times and happiness mixed together. The last two jobs were 4 environments in ~48 hours, optimization included, and 2 environments (pretty complex) in 72 hours.

We can definitely deliver to any standard, but please allow us more time if you call us; the results will always be better.

8 Comments
2024/10/26
23:55 UTC

2

The Human Race - Short Film (2024) An in-depth profile of Astral Prime spaceship racing during the mid-21st century, as political and economic tensions on Earth put the future of humanity at risk.

2 Comments
2024/10/25
23:42 UTC
