

[Question] Procedural Generation

#1
Hi,

As a developer, I've become increasingly interested in creating my own procedural planets / universe. I have done a fair amount of research on the techniques involved and was wondering what techniques you use in LT, to help point me in the right direction.

I've seen techniques using surface nets, dual contouring, warping cubes into spheres, and so on. Because I'm not sure which is best for procedural planets, I've gotten myself into a confused state about implementing LOD (I assume using marching cubes / octrees).

I'm also not quite sure when impostors come into play. And do you use compute shaders, or are the meshes streamed to the GPU?

I appreciate your time and hope you can give me some quick pointers. I really can't wait for LT to come to fruition. :D

Re: [Question] Procedural Generation

#3
I'm no developer (although I play around sometimes), but here are a few questions:

How close will people get to your planets?
Can you just throw in an icosphere with some pretty texture and call it a day?
Or do you need to be able to generate down to the meter for human level interaction?

If you only have to deal with space, start by drawing distant planets as a small star-like dot, colored based on whatever base color you choose for the planet (procedurally or otherwise).

Once they get close enough that an impostor would be larger than that star, switch it out for a regular icosahedron.
As they get closer, you can increase the number of triangles for a smoother appearance. (If you're smart, you only do this to the front of the sphere.)
(This is also a much easier algorithm to work with than the one linked above: you just divide each triangle into four by splitting its edges at their midpoints, then set each new vertex's distance from the center of the planet to the radius of the planet.)
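
Here's a minimal Python sketch of that subdivide-and-project step, assuming the starting icosahedron's vertices already sit at the planet's radius. The names (vertices as (x, y, z) tuples, triangles as index triples) are just illustrative.

Code: Select all

import math

def _to_radius(v, radius):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length * radius for c in v)      # push the vertex onto the planet's radius

def subdivide(vertices, triangles, radius):
    """Split every triangle into four via edge midpoints, projecting new vertices to the sphere."""
    vertices = list(vertices)
    midpoints = {}                                    # edge (i, j) -> index of its new midpoint vertex

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoints:
            a, b = vertices[i], vertices[j]
            m = tuple((a[k] + b[k]) / 2.0 for k in range(3))
            vertices.append(_to_radius(m, radius))
            midpoints[key] = len(vertices) - 1
        return midpoints[key]

    new_tris = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_tris += [(a, ab, ca), (b, bc, ab), (c, ca, bc), (ab, bc, ca)]
    return vertices, new_tris
Repeat this only on triangles that are close to (and facing) the camera and you get the front-of-the-sphere LOD described above.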

From there you can generate a texture over that.
Each face generates its own texture based on its current size; smaller faces generate higher-resolution images, so each face uses approximately the same number of pixels per unit of surface area.
This is a little harder, but there are plenty of good fractal algorithms you can use for it.
Perlin noise works too, if you use the same slice and manage to get it wrapped around the sphere (or just use 3D noise :V).
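
One way to make the "same number of pixels" rule above concrete is to pick each face's texture resolution from its edge length. This is just an illustrative sketch; the names, density, and clamp values are assumptions, not anything from the thread.

Code: Select all

def face_texture_resolution(face_edge_length_m, texels_per_m=0.5,
                            min_res=16, max_res=1024):
    """Pick a per-face texture size so texel density stays roughly constant."""
    res = int(face_edge_length_m * texels_per_m)
    return max(min_res, min(max_res, res))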

If you need to go down past that layer, use the same triangle-based method, and once you are within about 100 km of the surface (of an accurately scaled planet), start adding offsets to the vertices based on the texture.

The entire process is pretty simple in theory.

Code: Select all

# A runnable version of the sketch: for each face, bake a small texture by sampling
# seeded 3D noise at the position each pixel would have on a perfect sphere.
# (planet_faces, pixel_to_sphere, apply_texture and noise3d are whatever your mesh
#  and noise code provide.)
def generate(seed, x, y, z):
    return noise3d(seed, x, y, z)                     # any seeded 3D noise function

for face in planet_faces:
    face_img = [[0.0] * 100 for _ in range(100)]      # 100 x 100 texture for this face
    for px in range(100):
        for py in range(100):
            # 3D position of this pixel if the face lay on a perfect sphere
            xx, yy, zz = face.pixel_to_sphere(px, py)
            face_img[px][py] = generate(planet_seed, xx, yy, zz)
    face.apply_texture(face_img)                      # map the baked image onto the face
There is of course a little more work than that, but the basics of the job are there.
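
The generate / noise3d step above is where the fractal noise mentioned earlier comes in. Here's a self-contained sketch of a hash-based 3D value noise plus fBm (fractional Brownian motion) layering; it's only illustrative, and any Perlin or simplex noise implementation would do the same job.

Code: Select all

import math

def _hash01(seed, xi, yi, zi):
    # Cheap integer hash mapped to [0, 1); fine for illustration, not production.
    h = (xi * 374761393 + yi * 668265263 + zi * 1274126177 + seed * 144665) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1103515245) & 0xFFFFFFFF
    return ((h ^ (h >> 16)) & 0xFFFFFF) / float(1 << 24)

def _lerp(a, b, t):
    return a + (b - a) * t

def value_noise3d(seed, x, y, z):
    xi, yi, zi = math.floor(x), math.floor(y), math.floor(z)
    tx, ty, tz = x - xi, y - yi, z - zi
    tx, ty, tz = (t * t * (3.0 - 2.0 * t) for t in (tx, ty, tz))   # smoothstep weights
    # Hash the 8 corners of the containing lattice cell, then trilinearly interpolate.
    c = [[[_hash01(seed, xi + i, yi + j, zi + k) for k in (0, 1)]
          for j in (0, 1)] for i in (0, 1)]
    x0 = _lerp(_lerp(c[0][0][0], c[0][0][1], tz), _lerp(c[0][1][0], c[0][1][1], tz), ty)
    x1 = _lerp(_lerp(c[1][0][0], c[1][0][1], tz), _lerp(c[1][1][0], c[1][1][1], tz), ty)
    return _lerp(x0, x1, tx)

def fbm3d(seed, x, y, z, octaves=5, lacunarity=2.0, gain=0.5):
    # Sum octaves of noise at increasing frequency and decreasing amplitude.
    total, amp, freq = 0.0, 1.0, 1.0
    for _ in range(octaves):
        total += amp * value_noise3d(seed, x * freq, y * freq, z * freq)
        freq *= lacunarity
        amp *= gain
    return total
Feeding the (xx, yy, zz) sphere position into fbm3d gives a seamless pattern over the whole planet.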

Re: [Question] Procedural Generation

#5
While I've never seen its source code, I'm fairly sure the wonderful Space Engine uses a simple 2D shaded circle for most space objects beyond a certain distance -- much faster and less memory-intensive than representing every visible thing as a 3D object.

It's only when you cross a certain distance boundary that LOD rules switch between a 2D circle and a 3D object. Most of the time, virtually everything you see can be rendered as a simple circle. That's how programs like Space Engine are able to display and smoothly rotate and translate through thousands of "stars" at a time: they aren't actually objects at all.
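
A minimal sketch of that kind of distance-based switch, assuming you pick the representation from the object's approximate on-screen size; the threshold and names are illustrative, not Space Engine's actual rules.

Code: Select all

import math

def pick_representation(object_radius, distance, fov_y_radians, screen_height_px,
                        pixel_threshold=4.0):
    """Return which LOD to draw: a flat shaded disc far away, a full mesh up close."""
    angular_size = 2.0 * math.atan2(object_radius, distance)       # radians subtended
    size_px = angular_size / fov_y_radians * screen_height_px      # rough on-screen diameter
    return "disc" if size_px < pixel_threshold else "mesh"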

Re: [Question] Procedural Generation

#6
Thanks for all your feedback so far; it has helped me think about some aspects in different ways. I do plan to support planetary landing. So I was thinking that if I created an algorithm to generate the planet using a technique like surface nets, I would be able to chunk parts of the planet based on what can be seen. But I'm not sure how to chunk something like a surface net. Maybe there's some way to chunk a window of data to be voxelized. I'm sort of thinking it may be a bit like the zoom-into-the-Mandelbrot technique, but in three dimensions.

Re: [Question] Procedural Generation

#7
eq2k wrote:Thanks for all your feedback so far; it has helped me think about some aspects in different ways. I do plan to support planetary landing. So I was thinking that if I created an algorithm to generate the planet using a technique like surface nets, I would be able to chunk parts of the planet based on what can be seen. But I'm not sure how to chunk something like a surface net. Maybe there's some way to chunk a window of data to be voxelized. I'm sort of thinking it may be a bit like the zoom-into-the-Mandelbrot technique, but in three dimensions.
One idea is to use the triangles like I described above.

You take each vertex and offset it from the perfect radius around the center by a noise value.

The trick is that you sample 3D noise at the point on the sphere's surface where the vertex would sit if the planet were a perfect sphere.

This gives you a contiguous, spherical heightmap without the edge distortion you would get from generating it from 2D noise.

The triangles give you even surface coverage and let you control how the LOD works (you only need to subdivide triangles that are nearby).

Triangles are nice and easy to code; the only complex parts of this option are getting the noise to look nice and then mapping a texture to the triangles.
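
A minimal sketch of that displacement step, assuming any seeded 3D noise function (noise3d below is a stand-in, e.g. the fBm sketch earlier in the thread):

Code: Select all

import math

def displace_vertex(vertex, planet_radius, height_scale, seed, noise3d):
    """Push a vertex out along its own direction from the planet centre by noise."""
    length = math.sqrt(sum(c * c for c in vertex))
    direction = [c / length for c in vertex]       # where the vertex sits on the unit sphere
    h = noise3d(seed, *direction)                  # sample 3D noise on that unit-sphere point
    return [c * (planet_radius + height_scale * h) for c in direction]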

Re: [Question] Procedural Generation

#9
eq2k wrote:Hi,

As a developer, I've become increasingly interested in creating my own procedural planets / universe. I have done a fair amount of research on the techniques involved and was wondering what techniques you use in LT, to help point me in the right direction.

I've seen techniques using surface nets, dual contouring, warping cubes into spheres, and so on. Because I'm not sure which is best for procedural planets, I've gotten myself into a confused state about implementing LOD (I assume using marching cubes / octrees).

I'm also not quite sure when impostors come into play. And do you use compute shaders, or are the meshes streamed to the GPU?

I appreciate your time and hope you can give me some quick pointers. I really can't wait for LT to come to fruition. :D
The techniques you mention are for turning 3D density functions into surfaces. This is basically a "power tool" that can be used to handle anything. LT, for example, used to use marching cubes for asteroids and a few other things. Now it uses surface nets (better), and soon probably dual contouring.

But as for terrain, most game engines actually stick to the 2D / heightmap approach. Realize that a sphere can be created by 'normalizing' a box (i.e., take every point on the surface of a unit box spanning [-1, 1] in all dimensions, then normalize that point to have length 1) -- this leads us to a very nice trick: a planet can be seen as six individual planes (each plane is a side of the box) that have been 'normalized' into a sphere. Now we have a very nice 2D geometry for the planet.
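
A minimal sketch of that normalize-the-box trick (illustrative names, not LT's code): build a grid on one face of the [-1, 1] cube and push every point onto the unit sphere.

Code: Select all

import math

def cube_face_to_sphere(n, fixed_axis=2, fixed_sign=1.0):
    """n x n grid (n >= 2) on one face of the [-1, 1] cube, normalized onto the unit sphere."""
    points = []
    for i in range(n):
        for j in range(n):
            u = -1.0 + 2.0 * i / (n - 1)              # sweep [-1, 1] across the face
            v = -1.0 + 2.0 * j / (n - 1)
            p = [u, v]
            p.insert(fixed_axis, fixed_sign)          # this face's constant coordinate
            length = math.sqrt(sum(c * c for c in p))
            points.append([c / length for c in p])    # normalize: box point -> sphere point
    return points

# Six calls (fixed_axis 0..2, fixed_sign +1 / -1) cover the whole sphere.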

Here comes the hard part. To make full-scale procedural planets, you need adaptive LOD. This is why you want to start with 2D and not 3D (dealing with the LOD issues from surface nets or DC is far harder than dealing with 2D LOD issues). There is really only one sensible choice: a quadtree.

Each of the six 'sides' of your planet becomes an adaptive quadtree that gets dynamically tessellated based on how close the camera is to the given 'chunk.' From there it is a matter of defining a heightmap function that can be evaluated in 3D (anywhere over your planet sphere). Typically you then just displace the vertices of your quadtree by the heightmap amount in the direction of the normal vector ('outward'), which is particularly easy since it's a sphere: the normal is just the normalized position of your vertex.

Finally, you will need to deal with the issue of 'cracking,' which occurs when two neighboring quadtree nodes have been tessellated to different degrees; their vertices will not line up along the shared edge, and you will see cracks in your terrain.
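
Here is a sketch of that distance-driven quadtree split, under illustrative assumptions (patch size vs. camera distance as the split test, a caller-supplied uv_to_world mapping onto the sphere); this is not LT's actual implementation, and crack fixing is left out.

Code: Select all

import math

def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class QuadNode:
    """One square patch of a cube face, in that face's [-1, 1] uv space."""
    def __init__(self, cx, cy, size, depth=0):
        self.cx, self.cy, self.size, self.depth = cx, cy, size, depth
        self.children = []

    def update(self, camera_pos, uv_to_world, planet_radius,
               max_depth=12, split_ratio=2.0):
        # Split while the camera is close relative to the patch's world-space size,
        # merge again once it moves away.
        center_world = uv_to_world(self.cx, self.cy)       # patch centre on the sphere
        patch_extent = self.size * planet_radius           # rough world-space edge length
        close = _dist(camera_pos, center_world) < split_ratio * patch_extent
        if self.depth < max_depth and close:
            if not self.children:
                h = self.size / 4.0
                self.children = [QuadNode(self.cx + dx, self.cy + dy,
                                          self.size / 2.0, self.depth + 1)
                                 for dx in (-h, h) for dy in (-h, h)]
            for child in self.children:
                child.update(camera_pos, uv_to_world, planet_radius, max_depth, split_ratio)
        else:
            self.children = []                             # leaf: this patch gets meshed
Leaves are the chunks you actually mesh: sample the heightmap over the patch, displace along the normalized position, and stitch or skirt edges where neighboring leaves differ in depth to hide the cracks mentioned above.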

That's a super-high-level overview of what you need to build an adaptive, procedural planet. You can see it is already quite complex even if we just treat the planet as a surface and use heightmaps. It gets significantly more complex when we try to do a full-3D generation of that surface using voxel techniques (don't go there until you're ready!)

The first step I recommend is learning how to do this all with just one, non-adaptive plane. Learn how to create a plane with NxN vertices. Learn how to create a procedural noise function, then displace those vertices by the output of the noise function. This is your super-basic procedural 'terrain.' Next step: make it adaptive. Here is where the real work lies. Finally, when you've got a single adaptive planar terrain working, it's pretty smooth sailing: create a box out of six of them, normalize (as discussed earlier) to get a sphere, and make whatever adjustments are necessary (shouldn't be much) to glue it all together. Voila.
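
A minimal sketch of that first step, assuming some noise2d function you supply yourself (names are illustrative):

Code: Select all

def make_flat_terrain(n, spacing, noise2d, height_scale=1.0):
    """N x N grid of (x, y, z) vertices in the XZ plane, displaced in Y by noise."""
    vertices = []
    for i in range(n):
        for j in range(n):
            x, z = i * spacing, j * spacing
            y = height_scale * noise2d(x, z)   # the procedural part: displace by noise
            vertices.append((x, y, z))
    return vertices
Triangulating the grid (two triangles per cell) gives the renderable mesh; the adaptive version replaces this single grid with the quadtree sketched earlier.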

Re: [Question] Procedural Generation

#11
A related practical question concerns the "procedural" part: how much of the semi-randomly generated terrain/heightmap data must be stored forever once the player has gotten close enough to a planet's surface for that data to be generated?

Storing everything once it's been generated ensures that a place, once visited, always has the same apparent geometry on subsequent visits, but at an ever-increasing storage cost.

Alternatively, randomly generating data on every new visit reduces storage costs, but the details of the terrain will be different each time.

So how should developers decide which of these is best for their project?

To put it another way, how much terrain geometry/texture data must be stored to persuade most players that a place looks "the same" on each visit?

Re: [Question] Procedural Generation

#12
How about storing only the unique identifier for the planet? Combine that with the universe seed, and you get a new seed only used for generating the random stuff (size, type, whatever) for that planet.

When you visit the planet the next time, the same seed gets used, so the same random sequence is generated.
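
A small sketch of that scheme, using Python's hashlib purely as a deterministic mixer (any stable hash would do; the names are illustrative):

Code: Select all

import hashlib
import random

def derive_planet_seed(universe_seed: int, planet_id: str) -> int:
    """Derive a stable 64-bit seed for one planet from the universe seed and its id."""
    digest = hashlib.sha256(f"{universe_seed}:{planet_id}".encode()).digest()
    return int.from_bytes(digest[:8], "little")

# The same inputs always give the same planet parameters on every visit.
rng = random.Random(derive_planet_seed(12345, "sector7/planet-3"))
size, kind = rng.uniform(2000.0, 12000.0), rng.choice(["rocky", "gas", "ice"])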

Re: [Question] Procedural Generation

#13
Flatfingers wrote:A related practical question concerns the "procedural" part: how much of the semi-randomly generated terrain/heightmap data must be stored forever once the player has gotten close enough to a planet's surface for that data to be generated?

Storing everything once it's been generated ensures that a place, once visited, always has the same apparent geometry on subsequent visits, but at an ever-increasing storage cost.

Alternatively, randomly generating data on every new visit reduces storage costs, but the details of the terrain will be different each time.

So how should developers decide which of these is best for their project?

To put it another way, how much terrain geometry/texture data must be stored to persuade most players that a place looks "the same" on each visit?

If that happens, you're doing your PCG wrong. :P

It's as pillevoid said: it's all based on the seed. Just like Minecraft maps with the same seed look the same every time, planets will always look the same when regenerated.
Assuming the algorithm doesn't change in between, of course.

Re: [Question] Procedural Generation

#15
It's worth storing planet positions, and starbase- and ship-related stuff, but the actual planet meshes and textures are pointless to store.
Maybe caching them on disk is a good idea, but anything procedurally generated (so long as generation is quick and efficient) isn't worth keeping unless it can change.
