/r/ARKitCreators
A place to showcase your ARKit demos/tests/products, get help with creating and developing or just to hang around!
The purpose of this sub is to provide a place for creators, developers or whoever is interested in Apple's ARKit framework.
Share your demos, tests and even final products. Help and get help with creating.
Enjoy seeing what others create and maybe get the chance to be the first to have your hands on it!
I am trying to get the focal length in millimeters from the ARFrame camera, but I can't seem to manage it.
I found a Stack Exchange post about computing the field of view in degrees, which gives me a number that changes slightly every frame - which I expect, as the camera probably adjusts focus.
However, the field of view in degrees is not very useful to me; I need the focal length. The SceneKit camera provides one, but that is a static number, and when I try to match it in a 3D application it doesn't seem to match. I assume that's because the ARFrame is cropped somehow.
Does anyone have experience with this?
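For what it's worth, here is a rough sketch of one way to approach it. ARKit exposes the focal length in pixels in the camera intrinsics; Apple does not publish the physical sensor size, so a true millimetre value isn't directly available, but a 35mm-equivalent can be approximated under the assumption that the image width spans the full sensor width:
import ARKit

func focalLength35mmEquivalent(for frame: ARFrame) -> Float {
    // Focal length in pixels sits at [0][0] of the 3x3 intrinsics matrix.
    let fxPixels = frame.camera.intrinsics.columns.0.x
    let imageWidth = Float(frame.camera.imageResolution.width)
    // Scale as if the image width spanned a 36mm-wide full-frame sensor.
    return fxPixels * 36.0 / imageWidth
}
Note this is an approximation: if the ARFrame is cropped relative to the sensor (as suspected above), the equivalent value will be off by the crop factor.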
Hi! I'm a Product Manager at Tetavi, a tech startup. We are currently working on an app that enables creators like you to create unique content based on elements from the world of 3D.
What can you do right now?
👯 Create beautiful volumetric 3D captures
🎥 Seamlessly turn 2D videos into 3D moments
🤯 Edit perspective, rotate your model or play with scale
🎨 Add immersive environments or full-body effects
Any content is shareable on TikTok, Instagram, Twitter, etc.
If you're interested, we'd love for you to check it out: register on our waiting list and I will send you a personal invite code. https://53kwcbn1zwo.typeform.com/to/gtoOJOuZ
Just shout if you have any questions, and have a nice day!
Hey
We are looking for AR/VR enthusiasts who are interested in exploring an app that allows you to create in AR and VR environments. In addition, you can insert real human captures, including yourself.
As stated in the question above, I would like to know if it is possible to capture the content of a view and then place that content in the real world via AR. Can someone help, or does anyone have other solutions? Thanks a lot in advance.
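One possible approach (a sketch, not the only solution): render the view into an image, then use that image as the texture of a plane placed at a raycast hit. The sizing and orientation below are illustrative assumptions:
import ARKit
import SceneKit

func place(view: UIView, in sceneView: ARSCNView, at result: ARRaycastResult) {
    // Capture the view's current contents as a UIImage.
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let image = renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    }

    // Map the image onto a plane sized roughly like the view (points -> metres).
    let plane = SCNPlane(width: view.bounds.width / 1000,
                         height: view.bounds.height / 1000)
    plane.firstMaterial?.diffuse.contents = image

    let node = SCNNode(geometry: plane)
    node.simdPosition = SIMD3<Float>(result.worldTransform.columns.3.x,
                                     result.worldTransform.columns.3.y,
                                     result.worldTransform.columns.3.z)
    node.eulerAngles.x = -.pi / 2   // lay the plane flat on a horizontal surface
    sceneView.scene.rootNode.addChildNode(node)
}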
This is a somewhat elementary question - sorry 'bout that. Do I understand correctly that ARKit's raycast subsumes all the functionality previously provided by hitTest, and that is why hitTest is deprecated? Is
raycast(from: xxx, allowing: .existingPlaneGeometry, alignment: .any)
equivalent to
hitTest(xxx, types: .existingPlaneUsingGeometry)?
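For reference, a minimal sketch of the raycast equivalent (assuming an ARSCNView named sceneView); the raycast target that mirrors hitTest's .existingPlaneUsingGeometry is .existingPlaneGeometry:
import ARKit

func raycastExistingPlane(at point: CGPoint, in sceneView: ARSCNView) -> ARRaycastResult? {
    guard let query = sceneView.raycastQuery(from: point,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .any) else { return nil }
    // session.raycast(_:) returns hits sorted nearest-first, like hitTest did.
    return sceneView.session.raycast(query).first
}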
I am keenly interested in making AR/VR applications, but couldn't find any active communities. So I was thinking of making one of my own; if you are interested, please check out the Discord link.
P.S.: comment/DM if you are already part of any active communities.
Hi, I'm not seeing environmental shadows (my RealityKit object casting a shadow from ambient light) if the device doesn't have LiDAR. Do I have that right? Is this a limitation of non-LiDAR devices, or am I missing something? Just to be clear about what I'm asking: if I place a cube on a table, I'd like to see a shadow around its base. The only options I can find for doing this relate to sceneUnderstanding, which is a LiDAR-only thing.
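One commonly suggested workaround for non-LiDAR devices is a manual shadow catcher: an invisible plane with an OcclusionMaterial plus a shadow-casting directional light. A hedged sketch (the light placement and shadow parameters here are illustrative assumptions):
import RealityKit

func addShadowCatcher(to arView: ARView) {
    let anchor = AnchorEntity(plane: .horizontal)

    // Invisible plane that receives shadows cast by objects above it.
    let catcher = ModelEntity(mesh: .generatePlane(width: 2, depth: 2),
                              materials: [OcclusionMaterial()])
    anchor.addChild(catcher)

    // Directional light with shadows enabled, aimed down at the plane.
    let light = DirectionalLight()
    light.light.intensity = 5000
    light.shadow = DirectionalLightComponent.Shadow(maximumDistance: 5, depthBias: 2)
    light.look(at: [0, 0, 0], from: [0, 3, 1], relativeTo: nil)
    anchor.addChild(light)

    arView.scene.addAnchor(anchor)
}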
Hi, can anyone suggest Discord communities for ARKit/RealityKit developers?
Thanks!
Hi everyone, I have encountered this problem in SceneKit and I can't find a solution. Can someone help me? Thanks in advance for your support!
[SceneKit] Warning: Mesh element 0x282f9d3b0 of mesh 0x28299db20 has 3 channels but they all define the same topology
I'm developing AR apps (non-commercially, in my spare time), mainly for data representation, and I hope LiDAR is going to make hand interaction possible/easier.
I've been waiting patiently for the Apple headset to come out... but I'm losing patience now and am considering buying at least the smaller iPad Pro to enable me to make some progress here...
Any thoughts or words of wisdom?
3D model textures aren't showing up
I have some textured USDZ and DAE 3D models. When I import them into Xcode, the textures are missing. I tried changing the texture format from JPG to PNG to TGA. Unfortunately that didn't work, and when I run the app on my device the materials show up as white or as something like a color ID. Does anybody have any suggestions on how to fix this? Thanks in advance for your support!
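A hedged workaround sketch (not a guaranteed fix for the export problem itself): load the model and reassign the diffuse texture by hand. "diffuse.png" below is a placeholder name for whatever texture file ships with your model:
import SceneKit
import UIKit

func loadModelWithManualTexture(from modelURL: URL) throws -> SCNScene {
    let scene = try SCNScene(url: modelURL, options: nil)
    // Walk every node and point its material at the bundled texture image.
    scene.rootNode.enumerateHierarchy { node, _ in
        node.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "diffuse.png")
    }
    return scene
}
If the materials come back white even with this, the usual suspects are texture paths baked into the DAE/USDZ that don't survive the Xcode import, or textures that weren't added to the app bundle.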
Just wanted to pop this here in case anyone is interested! It's a webinar on how you can create your own Augmented Reality wayfinding (amongst other things).
I'd love to get this community's thoughts on how we are doing this - so I'd appreciate anyone who watches it live, or the recording afterwards, sharing an opinion!
Webinar link here, or DM me if you want to test it out (be aware that you will need a scan of the space you want to test in - we rely on these for our accuracy).
I'm trying to sneak a video demo out of the developers beforehand to post in here - so stay tuned!
The developer isn't sure, and some basic googling doesn't help.
It's using face tracking if that helps.
I would think there would be a simple toggle like "force z depth pass" or something.
Poplar Studio connects creators with top brands to help realise their AR and 3D campaigns. Become a creator today to begin applying to briefs from some of the biggest names in the business.
How it works:
Once you’ve signed up, expand your network of AR and 3D creators by joining our Slack group and build your knowledge through expert-led webinars. We also offer a huge range of certifications including SparkAR, Lens Studio, 8th Wall, 3D modelling and more which help you stand out from the crowd!
Sign up today at https://poplar.studio/creators/ to become a certified Poplar Studio creator!
Hi,
I am an entrepreneur in the construction industry and passionate about all types of technology.
I would like to develop an application for the iPhone 12 Pro / iPad Pro using LiDAR to scan rooms and take measurements of floor, wall, and ceiling surfaces.
Do you know of any tutorials for running some tests?
Thanks in advance.
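A minimal starting point (a sketch, not a full tutorial): enable LiDAR scene reconstruction so ARKit builds a mesh of the room, which you can then measure via the resulting ARMeshAnchors. This assumes an ARView named arView:
import ARKit
import RealityKit

func startRoomScan(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    // Scene reconstruction is only supported on LiDAR devices.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    configuration.planeDetection = [.horizontal, .vertical]
    arView.session.run(configuration)
}
Apple's "Visualizing and Interacting with a Reconstructed Scene" sample code is probably the closest thing to an official tutorial for this.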
Not sure if this is the place to ask, but I've seen a bunch of concepts (here as well) about turning real-life objects into 3D models using LiDAR and the like, yet I haven't actually found an app that can do that. Does one exist?
Hi everyone! I recently released a body-interactive music app called "Affine Tuning". It uses RealityKit's Motion Capture feature to detect movements and react to them musically.
I know that this is technically not a proper AR project, as it does not augment the video image in any way, but I hope it still might be interesting, as it adds an artistic sonic layer to the input (body movement).
The app itself is basically a collection of dynamic, interactive compositions. It is not a game or an instrument - more an experimental experience. Right now there are three pieces in the app, and I want to add more music and features soon. I am also thinking about soundscapes and similar feedback that is not strictly "music".
Here is a trailer: https://www.youtube.com/watch?v=DLhn-0kDF_c
Here is a video which shows the Motion Capture input: https://www.youtube.com/watch?v=vPlUoxunQpA
And here the download: https://apps.apple.com/app/id1515435997
The app is completely free - an artistic endeavour rather than a commercial one. I'm simply happy about everyone who tries it out. If you want to share your experience, or have comments or ideas, I am really looking forward to your input!
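For anyone curious about the technique, here is a hedged sketch of the kind of setup such an app might use (not the author's actual code): ARKit body tracking delivering joint transforms that can drive musical parameters:
import ARKit

class BodyTracker: NSObject, ARSessionDelegate {
    func run(on session: ARSession) {
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // Joint transforms are relative to the body anchor, e.g. the left hand.
            if let hand = body.skeleton.modelTransform(for: .leftHand) {
                let height = hand.columns.3.y
                print("left hand height relative to body root:", height)
            }
        }
    }
}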
I've got a quick question regarding ARKit's scene reconstruction. Is it possible to get the world coordinates of the faces/vertices that are part of the generated mesh, or to select them individually?
After looking through Apple's documentation and tinkering with the example apps, it does not seem possible when working with the faces property of ARMeshGeometry, but the vertices property does return coordinates. Here's Apple's code snippet on how to select specific vertices:
extension ARMeshGeometry {
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
        let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
        let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
        return vertex
    }
}
I've tried to place objects at those coordinates to see what they refer to, but they somehow end up in the middle of the room, far away from the mesh, leaving me a bit confused as to what the vertex coordinates actually refer to.
I'd appreciate any answers on how to approach this!
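A likely explanation, sketched below: ARMeshGeometry vertices are expressed in the mesh anchor's local coordinate space, not world space, so they need to be transformed by the anchor's transform first (this builds on the vertex(at:) extension above):
import ARKit

func worldPosition(of index: UInt32, in anchor: ARMeshAnchor) -> SIMD3<Float> {
    // Vertex in the anchor's local space.
    let local = anchor.geometry.vertex(at: index)
    // Transform into world space using the anchor's transform.
    let world = anchor.transform * SIMD4<Float>(local, 1)
    return SIMD3<Float>(world.x, world.y, world.z)
}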