Presented by:

I am 14 years old at the time of Snap!Con 2021. I am in middle school, which means I have plenty of time. I am glad I am not in high school for COVID-19. I will be going into the IB program next year. Last year, along with two other people, I ran an experiment in Astro Pi in which we measured the vegetation around cities compared with areas far from cities. My talk is about 3D rendering with raycasting.

Last year, Snap! got a ray-length block. Using it, we could make a 3D image, even with curved walls. However, that system had the problem that it was still only a 2D environment, without the ceiling and floor heights of DOOM. Based on that, I decided to create a program which does a similar thing, but in three dimensions. It chooses a colour for each pixel depending on the distance to the shape and the complexity of that location. The advantage of this system is that it can render images with many shapes quickly, as long as there is a simple pattern; in fact, it can render self-similar fractals without any additional cost. At the moment, I can generate a sphere, a tetrahedron, and a Sierpiński triangle. The renderer computes the distance to the nearest surface and moves forward exactly that much, but the step can also be cut off at a minimum distance, which can be used to give shapes additional width. This is required for the Sierpiński triangle to even be visible, but has to be minimized for the shells. At the moment, lighting is pillow shaded, like in the original ray casting demo, but any value can be calculated from the shape, such as the depth of the fractal at that point. While not complete, this project allows full 3D rendering with the same appearance as with the ray-length block.
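
For readers unfamiliar with the stepping idea, here is a minimal Python sketch of it (the actual project is written in Snap! blocks, not Python; the names sphere_sdf, march, and min_step are illustrative placeholders, and the character-based output only stands in for the pillow shading described above). It marches each ray forward by the distance to the nearest surface, with min_step acting as the cut-off that gives thin shapes extra width.

import math

def sphere_sdf(x, y, z, cx=0.0, cy=0.0, cz=4.0, r=1.0):
    # Distance from a point to the surface of a sphere centred at (cx, cy, cz).
    return math.sqrt((x - cx)**2 + (y - cy)**2 + (z - cz)**2) - r

def march(ox, oy, oz, dx, dy, dz, max_steps=64, hit_eps=0.001,
          min_step=0.0, max_dist=50.0):
    # Step along the ray by the distance to the nearest surface.
    # Forcing each step to be at least min_step gives thin shapes
    # extra apparent width, as described for the Sierpinski triangle.
    t = 0.0
    for _ in range(max_steps):
        d = sphere_sdf(ox + dx * t, oy + dy * t, oz + dz * t)
        if d < hit_eps:
            return t        # hit: distance travelled along the ray
        t += max(d, min_step)
        if t > max_dist:
            break
    return None             # miss

def render(width=40, height=20):
    # Pick a character per pixel from the hit distance (nearer = denser character).
    shades = "@%+-. "
    for j in range(height):
        row = ""
        for i in range(width):
            # Pinhole camera: one ray per character cell.
            dx = (i / width - 0.5) * 2.0
            dy = (0.5 - j / height) * 2.0
            dz = 1.5
            length = math.sqrt(dx * dx + dy * dy + dz * dz)
            t = march(0.0, 0.0, 0.0, dx / length, dy / length, dz / length)
            if t is None:
                row += " "
            else:
                idx = min(max(int((t - 3.0) * 4), 0), len(shades) - 2)
                row += shades[idx]
        print(row)

render()

Swapping sphere_sdf for the distance function of a tetrahedron or Sierpiński triangle is all that is needed to render those shapes, which is why the fractal costs no more than the sphere.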

Duration: 5 min
Room: Plenary
Conference: Snap!Con 2021
Type: Lightning Talk
Track: Lightning Talks