generation of a normal and a tangent for each segment (cyan & purple vectors): they can easily be used to generate a new road starting from any point;

a much better random selection, based on the linear congruential formula: X1 = (a*X0 + b) % m;

This random generation merits a bit of attention.

Until now, I was randomly picking a new start point from an existing road to create a new road. The process is costly, and there is no guarantee of not picking the same point on the same road several times.

With the formula above, found on the great Numberphile channel (see below), I assign once and for all a random number to each road’s dot. The particularity of this random generation is that it will NEVER repeat the same value within one sequence. Once generated, my dots each have a number in the range [0,1], with a uniform distribution.

For instance, in a line of 10 dots (and therefore 9 segments), each dot gets a random number between 0 and 1. If you sort the dots by their random values and measure the gap between consecutive sorted values, the average gap will be roughly 0.1!

Using this random number is straightforward. If you want to generate a secondary road on 50% of the dots of the first one, you just loop over the dots and check whether the random value of each dot is < 0.5. If the distribution were not uniform, this would not guarantee roughly one sub-road every two dots. As it is, you can just specify the percentage: all the random calculation has already been done, and in a more controlled way than ofRandomuf() does it.

This formula requires big prime numbers (>10000) for a and b. Here is the source I used: list of primes.

After a bit of a struggle with the management of the 3D grid, the advantage is clear: it’s now easy to generate a road network in 3D, with the streets connecting automatically as they are generated. It’s a simple brute-force, unsupervised generation algorithm, but it is memory-efficient and error-free.

Just to explain the images above a bit: the cubes are the cells of the road grid. Only the required ones are created during road generation. They are connected to each other to speed up the proximity tests once a new road segment is added. I’ll measure the generation time soon, but it’s already quite fast compared to the first tests made several days ago in Processing.

It’s the school holidays this week in Belgium, a good occasion for me to pull an old project out of the grave: disrupted cities.

Four years ago, the scenes were generated in Blender: city maps, meshes and UVs. I was certain it wouldn’t be too hard to port to real-time, and that’s what I started this morning! The planes are generated at runtime inside the engine, the UVs too, and after a rather long loading, everything runs nice and smooth (see the 2 screenshots above). A huge batch of high-res textures, scans of a monotype, is necessary for the visual quality of the rendering.

The background is not fully white, leaving some room for bright events.

The whole project is about wandering in an abstract city.

As students are on holiday, I’ll spend mine working on this, building on the developments made since the beginning of the year.

The video is rendered with Blender, but it will be done in real-time with the polymorph engine in a few days!!! 🙂 – currently generating the levels of the towers.

The error in the vertex calculation is having only one vertex at each corner of the bottom plane. This leaves only one UV coordinate for these vertices, where two are required – see the UV unwrap below.

Each joint can receive a configuration to deform the height of its faces. This will certainly be done via a shader later.

That’s it for this week, back on Monday for other adventures!

End of the week and a bit disappointed: I didn’t finish the joint generation. Memory manipulation using OGRE_ALLOC_T, MemoryDataStream & DataStreamPtr is tricky. The idea is to generate the texture on the fly for each new joint. As a result, I couldn’t complete the UV calculation… The top faces are OK, the other ones remain to be done.
