/r/SynthEyes
SynthEyes™ is a standalone application optimized for camera, object, geometry, and planar tracking, stabilization, and motion capture, with high performance and a huge feature list at an affordable price. Use SynthEyes for critter insertion, fixing shaky shots, virtual sets, stereoscopic production, 360° VR, architectural previews, accident reconstruction, product placement, face and body capture...
Unofficial subreddit for SynthEyes users that want to find and share tips about the software and match moving techniques.
I'm looking for a bit of advice here about anamorphic footage tracking.
I've undistorted my footage using a distortion board (solved in Syntheyes) in AE with RE:Map UV, it's looking nicely undistorted and correct when Pixel Aspect Ratio correction is switched on in the comp window.
Should I bring the undistorted footage into SynthEyes for tracking with a 1.5 pixel aspect ratio (from my undistorted comp, which is 6032x3534, from footage that is originally 5120x3000), or should I render it out with a square pixel aspect ratio (in a comp that is 9048x3534) for tracking?
And how should that then be set up when loading into SynthEyes?
The original footage details are below:
Image Aspect: 2.56 : 1
Pixel Aspect: 1.5
Anamorphic Squeeze: 1.5x
Sensor Aspect: 1.7 : 1
Camera sensor: 30.72 x 18.00 mm
I can't seem to find any clear tutorials online that show a good workflow. Any assistance would be greatly appreciated! The tracking data/camera will be going through to Maya & V-Ray.
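For anyone sanity-checking the numbers in questions like the one above, the resolution relationships are plain arithmetic. A minimal sketch (all figures are taken from the question; nothing here is SynthEyes-specific):

```python
# Sanity-check the anamorphic resolution math quoted above.
par = 1.5  # pixel aspect ratio of the anamorphic plate

# Undistorted comp, still stored with 1.5 PAR
undistorted_w, undistorted_h = 6032, 3534

# Desqueezing to square pixels multiplies the width by the PAR
square_w = round(undistorted_w * par)
print(square_w)  # 9048, matching the square-pixel comp size

# Display (image) aspect once the PAR is applied
display_aspect = (undistorted_w * par) / undistorted_h
print(round(display_aspect, 2))  # ~2.56, matching "Image Aspect: 2.56 : 1"
```

In general, whichever route is chosen, the pixel-aspect setting entered during shot setup should match however the frames were actually rendered.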
Trackers jumping after altering the image preprocessor
The problem is that when I change the contrast or gamma in the Colour Processor, the trackers lose their track and jump everywhere. The same thing happens with the gamma, contrast, and curve tools. I don't know why it's happening; please help me with this.
Hi guys, I'm trying to track a long shot with a fairly complex camera move. I did a careful supervised manual track, but when I hit the solve button it doesn't solve the camera at all; within a few seconds it just reports that the camera isn't solving, on 10 or more frames. Please, can anyone help? #syntheyes #error
Hi
So I want to track the head of a person that has markers on it. I have 3 camera sources for this; how can I track those markers from the 3 cameras at the same time in SynthEyes?
Thanks.
Hello guys,
I'm new to SynthEyes and trying to learn it from YouTube. I've been tracking a plate that I filmed myself, and I'm trying to use the distortion grid that I got from the shoot too, but I don't understand how to do it. Any ideas, please?
Does anyone know how to solve this purple error when going from a tracked video in SynthEyes to Blender?
I want to track a shot in SynthEyes that was captured in portrait orientation at an angle, but the problem is that after tracking, when I export an Alembic (.abc) file to Blender, the footage doesn't match the camera and trackers. I need your help, guys.
Hey everyone,
I am new to SynthEyes, and I tracked a shot to add some 3D later in Blender.
I got a good solve error of 0.3, but the focal length got set extremely low, to 4mm. (Originally it was 40mm.)
I saw a video of one of the devs explaining that the focal length will always be somewhat different in SynthEyes, but when I import everything into Blender my 3D is really distorted due to the 4mm.
Did something go wrong with the track, or is there another way to fix it?
I already tried setting the focal length back to normal in Blender, but that just messes up the track, obviously.
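One thing worth ruling out in cases like the one above: in a pinhole camera model, the solve really constrains the field of view, so if the sensor (back-plate) width the solver assumes is wrong by some factor, the reported focal length comes out wrong by the same factor. A minimal illustration (the 40 mm / 4 mm figures are from the post; both sensor widths below are hypothetical):

```python
import math

def hfov_deg(focal_mm, sensor_width_mm):
    """Horizontal field of view of a pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# A 40 mm lens on a (hypothetical) 36 mm-wide back plate...
real = hfov_deg(40.0, 36.0)

# ...sees exactly the same field of view as a 4 mm lens on a 3.6 mm-wide
# back plate, so a 10x-too-small assumed sensor gives a 10x-too-small focal.
wrong = hfov_deg(4.0, 3.6)

print(real == wrong)  # True: the two fields of view are identical
```

If that turns out to be the cause, entering the correct back-plate width before solving, and using matching sensor settings on the Blender camera, should bring the focal length back in line.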
Hello Reddit,
I have a shot of a camera dollying and tilting down; the kicker is that the only thing in frame is the ground. There are plenty of points to track, but SynthEyes can't cope with the fact that they are all on the same plane. Any ideas?
Thank you in advance!
I am trying to export a stabilization I did in SynthEyes (peg stabilization) to After Effects, either as a camera, as 2D transform keyframes, or as trackers that I can use with the After Effects tracker. Every type of export I do from SynthEyes comes up with no keyframes in After Effects.
Hi!
Is it possible in Syntheyes to have geometry as reference? By reference I mean that the geo is not saved in the sni file but linked to the sni file. 3DEqualizer can do this and Maya can do this as well. This way the file size is kept small and there are no duplicates. Any help would be appreciated. Thanks.
Hey guys, does anyone know how to properly import vertical video? When I import the file, it automatically squashes its aspect ratio.
Hi!
I need some help!
I have footage with a zoom in it, and I keep getting a high error: 14 hpix. I've set the lens to "zoom, unknown" and the distortion calculation to zoom. Is there any way to get it down further? Should I add more trackers? Also, I was provided with a lens grid, but I'm not sure how to apply it to the shot. Any help would be great!
I don't know why my entire scene is moving with the camera in SynthEyes. How do I change that??
I was working on a school project, and when I reopened the scene it looked like this. I don't know how to fix it, and it's due today; the professors are ass at responding, so any help would be lovely. I'm assuming I accidentally moved or renamed something in the files, but I don't know what.
I have a tracked scene with 1837 trackers, but it only exported 29 of them. Odd. The .sni file says 29 exported but 1837 tracked. Any ideas why? Thanks.
I am working on green-screen footage that has tracker marks, and in SynthEyes I am getting an error of 0.135 (I have the lens details and distance details), so I was curious: is there a way I can get the error down to zero?
Hi,
I want to import an .mxf file from a camera into SynthEyes along with its .xml file.
Is that possible?
When I import the .mxf, my footage is grey and only has 9 frames, and I don't see an option to import the .xml as well.
Hey guys,
I'm having some problems with orbiting the perspective view. I'm a Maya user, and the orbit doesn't behave "normally".
What is the center around which it orbits? It feels like ten miles away in the distance...
In Maya it orbits around the mouse pointer location, meaning you can orbit really accurately.
Thanks!
Hello, I'm new to SynthEyes. I'm currently trying to export an FBX, but every time I go to Filmbox FBX and export my scene, it ends up exporting just an empty OBJ. What am I doing wrong? All settings are default.
A bit inexperienced here, so apologies in advance.
I have tracked about six complicated shots, only to have the client present new versions of them with extended heads and tails. I have tried using the Add Shot option to bring the new shots into SynthEyes. The result gives me a new untracked camera, but the old one is still there, and it won't account for the new frames when doing a new solve.
Does anyone know a simple method to import the new shot, and only track the extended handles of the shot?
Hello! I'm having a brain fart that won't go away. I'm working on a plate shot at 4096 x 2160 on the Blackmagic Pocket Cinema Camera 6K, and I'm having a hard time figuring out what I should type in the sensor-size field when prepping the footage. I know this may be simple, but my brain is done braining. Thank you in advance.
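One way to work this out, assuming the commonly published BMPCC 6K spec (23.10 × 12.99 mm sensor, 6144 × 3456 photosites) and that the 4096 × 2160 mode is a centered 1:1 sensor window rather than a downscale; both assumptions are worth double-checking against the camera manual:

```python
# Effective sensor size for a windowed recording mode: the active area
# shrinks in proportion to the fraction of photosites actually used.
# ASSUMPTIONS: published BMPCC 6K spec (23.10 x 12.99 mm, 6144 x 3456)
# and a centered 1:1 sensor window -- verify against the camera manual.
full_w_mm, full_h_mm = 23.10, 12.99
full_w_px, full_h_px = 6144, 3456

rec_w_px, rec_h_px = 4096, 2160  # recording mode from the question

eff_w_mm = full_w_mm * rec_w_px / full_w_px
eff_h_mm = full_h_mm * rec_h_px / full_h_px

print(f"{eff_w_mm:.2f} x {eff_h_mm:.2f} mm")  # roughly 15.40 x 8.12 mm
```

If the mode is actually a downscale from the full sensor width rather than a window, the full 23.10 mm width would apply instead; the camera's metadata or manual is the tiebreaker.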
I’m working on a project and am wondering if SynthEyes has the ability to determine the actual position of the camera from which the footage is being shot? I’ve tried to search if this is possible, but all I’ve been finding are tutorials on how to camera solve. I’m new to the software so I’m not entirely sure if this is a feature. If so, how would one go about figuring this out?
Been trying to teach myself SynthEyes, and I have been hitting a roadblock with online tutorials. I'm ultimately wondering if SynthEyes has a feature comparable to 3DEqualizer's ability to constrain a point to a vertex/line/face on geometry and then rotate the solve into the correct lineup using that one constrained point. So far I've struggled to find something like this in my own digging, but I could just be getting lost in the UI.
If any SynthEyes users are around, I could use some help. I've got a music video shot entirely on green screen, and I'm currently going through the shots tracking the camera movement for use in AE and Blender. Even though we placed tracking points every 3' on the green-screen wall, SynthEyes's auto-tracking always places trackers on the people (the most unreliable places for tracking camera movement), not on the green-screen tracking points. I've been doing manual tracker placement to get around this.
I've been reading and watching tutorials, but every single one I've found fails to discuss tracking shots with people in them. They only cover shots comprised of architecture and objects, which I've never found difficult even when using AE, Mocha, and Blender's internal tracking tools.
I feel like I must be missing something obvious, because I can't imagine there's not an easier way to get camera tracking on shots like this, not just unpopulated footage of landscapes, buildings, and the usual tutorial subject matter. If you have any insight into this, I'd greatly appreciate it. I really thought filming with tracking markers on the walls (with C-stands and lights for foreground tracking) would be helpful, but I'm spending just as much time manually tracking as I would have without them, since the software doesn't seem to recognize them unless I track them manually.
Lastly, I'm not intending to track planes or place objects into this. I literally just need the camera-movement data.
Thanks so much for your time and help!
I have footage in which I've zoomed in and out a couple of times, so is there a way to camera-track footage like that?
While the smaller version loaded without issue, the original 3840x2160 version of the video caused SynthEyes to freeze when loading.
Please help with advice!
Hi team,
A little help, please. When I save my undistorted sequence from the Image Preprocessor, it outputs frames that are cropped on the right and bottom. They look fine in the viewport and preview window. Any help is appreciated!
J.