/r/computergraphics

Welcome to r/computergraphics.

This subreddit is open to discussion of anything related to computer graphics or digital art.

Computer Graphics FAQ

Submission Guidelines

Anything related to CG is welcome.

  • If you submit your own work, please be ready to receive critiques.

  • If you submit other people's work, please give credit.

For more information about posting to our subreddit, read the r/computergraphics FAQ

Technical Questions?

Here are subreddits dedicated to specific fields:

r/vfx
r/GraphicsProgramming
r/MotionDesign
r/Programming
r/gamedev
r/Low_Poly
r/archviz
r/3Dmodeling
r/DigitalArtTutorials

Software Questions?

We'd love to help with questions about specific software, so please feel free to ask, but these software-specific communities may be able to provide more in-depth answers.

r/Maya
r/3dsMax
r/Cinema4D
r/Blender

/r/computergraphics

53,501 Subscribers

14

4D Gaussian pipeline

This is the method proposed in "Real-time Photorealistic Dynamic Scene Representation and Rendering with 4D Gaussian Splatting" (ICLR 2024).

Just wondering about the process of turning the 4D Gaussians into 3D Gaussians. Is it using a function of time to determine the 3D Gaussian at any given instant and then doing the normal splatting? (I didn't quite get the paper.)
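For reference: if the paper conditions its 4D Gaussians on time, which is what this guess describes, then the standard conditioning identity for a multivariate Gaussian applies. Split the 4D mean and covariance into spatial (xyz) and temporal (t) blocks; conditioning on a timestamp t then yields a 3D Gaussian in closed form (generic notation, not necessarily the paper's):

\mu_{xyz \mid t} = \mu_{xyz} + \Sigma_{xyz,t}\,\Sigma_{t,t}^{-1}\,(t - \mu_t), \qquad \Sigma_{xyz \mid t} = \Sigma_{xyz,xyz} - \Sigma_{xyz,t}\,\Sigma_{t,t}^{-1}\,\Sigma_{t,xyz}

The resulting 3D Gaussian can then be splatted with the normal pipeline for that instant.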

The paper also mentions that this method is not affected by obstructing views; just wondering why that is?

Many thanks!

0 Comments
2024/04/25
21:42 UTC

1

So, how do you actually texture procedurally generated terrain?

While Googling around for how to apply different kinds of textures in the context of procedural terrain generation, I can't help but notice that a lot of people "know but don't know" what they are talking about. I often encounter people saying "just do X" but seemingly never following up with how to actually do it. For a beginner who might not know much, it's quite annoying to never find more granular answers. I am, of course, not expecting a step-by-step "hold-your-hand" style tutorial walking through every single detail, but at least give a more fine-grained guideline than "just do X".


With that said...

Assume we generate a coarse, non-tessellated mesh CPU-side and run it through a tessellation shader to obtain a more fine-grained mesh, along with texture coordinates and normals for each new vertex. Furthermore, assume we have in our fragment shader:

  • A normalized height value ([0, 1], could also be in [-h, +h] if you're defining stuff in your own way),

  • A texture coordinate,

  • and 4 textures corresponding to "rock", "sand", "grass" and "snow".

Let's try to answer a question or two with more detail than your typical "coarse" answer found on the internet. It's likely safe to assume that most people here have used GLSL before, so feel free to contribute concrete GLSL code as a way of making your ideas concrete. It often helps to have something tangible on top of pure theory.


How should one choose the values for the boundaries between the different textures? Does there exist some somewhat robust way of automatically generating ranges, or is this part simply about manual fine-tuning?

The most straightforward way to do this seems to be to manually tweak the ranges until you get something that fits nicely for your particular generated terrain.

When making significant modifications to the terrain, e.g. by tweaking the parameters for how your heightmap is generated (Perlin noise parameters, say), you have to re-calibrate the ranges, which is the annoying downside of this strategy. What other strategies are there?
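One common alternative (a standard trick, not specific to any one answer): weight some layers by slope instead of height alone, so the height bands need less re-tuning when the noise parameters change. A rough sketch, assuming the interpolated normal from the tessellation stage is available as v_Normal (hypothetical name, as are the thresholds):

float slope = 1.0 - normalize(v_Normal).y;      // 0 = flat ground, 1 = vertical cliff
float rockWeight = smoothstep(0.4, 0.7, slope); // hand-tuned thresholds
// Overlay rock on whatever the height-based pick/blend produced:
color = mix(color, texture(rockTexture, texCoord).rgb, rockWeight);

Height then only has to separate the remaining layers (sand/grass/snow), which tends to be more stable across noise settings.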


How do we properly texture the terrain?

One naïve way of doing it is to simply use the defined boundaries to choose which texture to sample and use the sample directly as the color. However, this will typically not yield nice results, as it produces hard seams at the boundaries. For example,

vec3 color = vec3(0.0f);
float t = u_Height / u_HeightScale; // [0, MaxHeight] -> [0, 1]
if (t < 0.10f)
    // Water
    color = vec3(66.0f/255.0f, 135.0f/255.0f, 245.0f/255.0f);
else if (t < 0.25f)
    // Sand
    color = texture(sandTexture, texCoord).rgb;
else if (t < 0.50f)
    // Grass
    color = texture(grassTexture, texCoord).rgb;
else if (t < 0.75f)
    // Rock
    color = texture(rockTexture, texCoord).rgb;
else
    // Snow
    color = texture(snowTexture, texCoord).rgb;
FragColor = vec4(color, 1.0);

To combat this, people often suggest blending textures; however, how that can be done is unknown to me. I am assuming one has to do more calculations that depend on neighbouring vertices, e.g. their heights or something. What is a concrete way of doing this?
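A minimal sketch of the usual answer, assuming the same samplers and normalized height t as above: no neighbour data is needed, everything stays per-fragment. Each pair of adjacent layers is cross-faded over a small height window around the band edge (the edge positions and the transition half-width e below are hand-tuned placeholders):

vec3 water = vec3(66.0, 135.0, 245.0) / 255.0;
vec3 sand  = texture(sandTexture,  texCoord).rgb;
vec3 grass = texture(grassTexture, texCoord).rgb;
vec3 rock  = texture(rockTexture,  texCoord).rgb;
vec3 snow  = texture(snowTexture,  texCoord).rgb;

const float e = 0.05; // half-width of each transition zone
// Start at the lowest layer and fade each next layer in as t crosses its edge.
vec3 color = water;
color = mix(color, sand,  smoothstep(0.10 - e, 0.10 + e, t));
color = mix(color, grass, smoothstep(0.25 - e, 0.25 + e, t));
color = mix(color, rock,  smoothstep(0.50 - e, 0.50 + e, t));
color = mix(color, snow,  smoothstep(0.75 - e, 0.75 + e, t));
FragColor = vec4(color, 1.0);

Outside a transition zone each smoothstep evaluates to exactly 0 or 1, so this reproduces the if/else version everywhere except in the ±e bands, where the layers cross-fade instead of cutting hard.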

1 Comment
2024/04/25
16:19 UTC

6

Making an audio-visual module in Electron.js & Ableton

1 Comment
2024/04/24
17:40 UTC

48

I made a procedural planet with 3D cities for Blender Cycles (full video link in comments)

5 Comments
2024/04/24
14:02 UTC

5

Breath

0 Comments
2024/04/24
11:59 UTC

1

Algorithm for Even Distribution of Overlapping Bounding Boxes in JavaScript

Hey there! I'm a part-time JavaScript programmer. I'm looking for an algorithm to evenly distribute bounding boxes that may overlap in different ways; each box should stay as close as possible to its original position while still being distributed in a consistent manner. By consistent, I mean the distances between the individual boxes: they should not be aligned to a grid, but should keep a consistent distance between each other. I've attached a sketch of what I mean.

Two objects/bounding boxes may overlap partially or even completely. So there may be some bounding boxes with the exact same size and position that then need to be moved apart from each other, so that they end up next to each other. I guess I need a recursive algorithm or something like that, since I want each bounding box to "try" to stay as close as possible to its original position while still approaching the even spacing.

Is there any kind of algorithm that already does exactly something like this? Or is there any way you can guide me to solve this problem? How complex is it really? Visually it is an easy task, but I've tried to code it and it doesn't seem that simple at all.

I need to implement it in JS, if possible without any complex external libraries etc.

Thanks a lot for your help!

Link to the sketch:
https://i.ibb.co/fYpyqpk/eualize-Boundingbox-Distribution.png

2 Comments
2024/04/22
23:19 UTC

7

day/night cycle inside my custom 3D engine

0 Comments
2024/04/22
17:48 UTC

2

Vertex Buffer Objects tutorial

0 Comments
2024/04/22
08:22 UTC

0

🍋🍋🍋 What the lemons day?!

0 Comments
2024/04/21
17:55 UTC

11

Stylized Sakuragi Hanamichi character model and animation. Any feedback appreciated :)

5 Comments
2024/04/19
14:45 UTC

5

Current graphics programming schedule

I have 3 subjects to study:

  1. C++
  2. 3D Math
  3. OpenGL

And I work as a freelance developer - a bit of Blender and three.js.

So far I've been doing this:

  1. C++: 4hrs
  2. Freelance: 4hrs
  3. Part time job 2 days a week

But my math is very weak, so I'm thinking I should spend 1hr a day on algebra so I can speed up my math when I get to 3D Math.

Question is, is there a better way to plan out my day? Or keep it to what I have, one subject at a time?

Thank you.

4 Comments
2024/04/19
12:09 UTC

14

Hey there! Here's a brand new piece for the animated series \WAVES, a project where a looping artwork is dedicated each time to a different song or playlist! 📀

5 Comments
2024/04/19
10:32 UTC

10

OpenGL Procedural Terrain - improved placement of snow

4 Comments
2024/04/18
17:01 UTC

4

Just Come Home

0 Comments
2024/04/18
15:10 UTC

1

Made the project as a template and am giving it away for free to anyone who would find it useful. Enjoy!

0 Comments
2024/04/18
01:59 UTC

5

How does this wormhole effect shader make the distortion?

Below is the code for a simple wormhole effect shader made on shadertoy.com.

I understand everything else except:

uv.x = time + .1/(r);
uv.y = time + sin(time/2.575) + 3.0*a/3.1416;

Here's what I think it's doing:

  • `uv.x = time + .1/r` is making the later `texture()` take the color for the pixel from the right side of the wormhole's texture, making it look like it's moving forward. And I think the `+.1/r` is there to make the center of the wormhole distort more.

  • I have no idea what the `uv.y` is doing.

    void mainImage( out vec4 fragColor, in vec2 fragCoord )
    {
        float time = iTime;
        vec2 p = -1.0 + 2.0 * fragCoord.xy / iResolution.xy;
        vec2 uv;
        p.x += 0.5*sin(time);
        p.y += 0.5*cos(time*1.64);
        float r = length(p);
        float a = atan(p.y, p.x);
        uv.x = time + .1/(r);
        uv.y = time + sin(time/2.575) + 3.0*a/3.1416;
        float w = r*r;
        vec3 col = texture(iChannel0, uv).xyz;
        fragColor = vec4(col, 1.0);
    }
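For what it's worth, here is one reading of the two lines in question (the comments are interpretation, not the original author's):

    float r = length(p);       // distance from the (wobbling) tunnel center
    float a = atan(p.y, p.x);  // polar angle in (-pi, pi]

    // Radius -> texture x. 0.1/r blows up toward the screen center, so texels
    // get packed ever more tightly there: that's the distortion. Adding time
    // scrolls the texture "into" the tunnel, which reads as forward motion.
    uv.x = time + .1/(r);

    // Angle -> texture y. a/3.1416 is roughly a/pi in (-1, 1], so 3.0*a/3.1416
    // spans about 6 texture repeats around the full circle. The bare time term
    // spins the tunnel at a constant rate, and sin(time/2.575) adds a slow
    // back-and-forth twist on top.
    uv.y = time + sin(time/2.575) + 3.0*a/3.1416;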

2 Comments
2024/04/17
19:12 UTC

3

Demand for 10-100 billion particle/voxel fluid simulation on a single workstation?

0 Comments
2024/04/17
07:45 UTC

3

Backrooms animation

1 Comment
2024/04/16
20:55 UTC

1

Join PixelMentor Discord Server

📷 Exciting News! Join Our Vibrant CG & VFX Community on Discord! 📷

https://discord.gg/TRrw4ZZaXt

Are you passionate about Computer Graphics & Visual Effects? 📷 Look no further! Our Discord channel is the ultimate hub for professionals, enthusiasts, and newcomers alike! 📷

📷 Dive into a world of:
- Latest News: Stay updated with the hottest trends and breakthroughs in CG & VFX.
- Creative Challenges: Fuel your creativity with stimulating challenges and competitions.
- In-depth Tutorials: Learn from the best with tutorials covering a wide range of topics.
- Top Tools & Resources: Discover essential tools and resources to level up your skills.
- Networking Opportunities: Connect with industry professionals and fellow enthusiasts for collaborations and discussions.

Don't miss out on the opportunity to be a part of our dynamic community! Click the link below to join now and unlock a world of endless possibilities in CG & VFX! 📷📷 #CG #VFX #DiscordCommunity #JoinUs

Join Now!!!
https://discord.gg/TRrw4ZZaXt

https://preview.redd.it/670hlh4qspuc1.jpg?width=4096&format=pjpg&auto=webp&s=f63bdd0706695c4df7c949118b5197279a00a31d

0 Comments
2024/04/15
21:54 UTC

10

Trying to be healthy

1 Comment
2024/04/15
19:26 UTC

11

RoPes

0 Comments
2024/04/15
16:19 UTC

4

Why do we use an oval shape as the single unit in the point cloud in 3D Gaussian splatting?

Maybe my understanding is incorrect, but 3DGS is basically a point cloud formed from ovals whose colour varies depending on the viewpoint.

Just wondering why the oval is the preferred shape and not other shapes?

Is there a specific technique/paper that describes the process of finding the ideal shape for a single unit in a point cloud?
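For reference, the "oval" isn't a hand-picked sprite shape; it falls out of the maths. In the original 3DGS paper (Kerbl et al., "3D Gaussian Splatting for Real-Time Radiance Field Rendering", SIGGRAPH 2023), each point is an anisotropic 3D Gaussian, and the level sets of a Gaussian are ellipsoids:

G(x) = \exp\!\left( -\tfrac{1}{2} (x - \mu)^\top \Sigma^{-1} (x - \mu) \right), \qquad \Sigma = R\,S\,S^\top R^\top

where S is a diagonal scale matrix and R a rotation; this factorization keeps \Sigma a valid covariance during optimization. Gaussians in particular are convenient because they project to 2D ellipses in closed form under (approximately affine) camera transforms, the EWA splatting result, which keeps rasterization fast and differentiable.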

Many thanks!

6 Comments
2024/04/14
04:40 UTC

1

Suggestions for spiderweb interactive maps for teaching

I don't know if y'all have seen those presentations that are literally like spiderweb maps, where you click different locations on it to look at that info. Can anyone help me pinpoint what to use to create one? I'm trying to create a visual interactive diagram of different OSes and things for class. Any ideas?

0 Comments
2024/04/13
00:27 UTC

10

I Created the Walt Disney Castle after 5 YEARS

I challenged myself to recreate the Walt Disney castle after a 5-year hiatus. The new project was built using Blender 4.1, while the previous one was created with Blender 2.79.

I would love to hear your thoughts on the comparison between the two versions.

If you’re interested, you can watch the creation and animation here: https://youtu.be/l89sj4DiMNo

0 Comments
2024/04/12
20:10 UTC

9

LegoBat

0 Comments
2024/04/11
15:43 UTC

2

Setting shader array from external source (ILGPU, ComputeSharp) or create ComputeBuffer from pointer in Unity

I am writing an application that uses Unity as a visualizer, but as the core of this application has to be somewhat portable/disconnected from Unity, I am using ILGPU (I can change to ComputeSharp if need be; I went with ILGPU as it supports more platforms) to write some performance-critical code that takes advantage of the GPU (an advection/diffusion fluid engine). This all works fine and the code runs, but now I want to display the simulation on screen.

The naive way would be to do the work on the GPU through ILGPU, fetch the data back to the CPU, send it over to Unity, fill a ComputeBuffer with the data, and then point to that buffer in the shader. This will be slow.

So my question is, is it possible to directly set the ComputeBuffer in Unity using the pointer from ILGPU or some other magic on the shader end?

0 Comments
2024/04/10
18:46 UTC

6

A380 Vs Ion Man

2 Comments
2024/04/09
19:27 UTC

9

Farm

0 Comments
2024/04/08
14:15 UTC

13

Do you talk to your dog?

0 Comments
2024/04/07
16:01 UTC
