Tuesday, March 12, 2019

Milling Fractals with Datashader

Recently we learned about datashader, a visualization library designed to work with very large datasets. It is built on the parallel computing library dask and the just-in-time compiler numba. While reading the documentation we found an example page with beautiful fractals created by plotting very long trajectories of simple dynamical systems. We decided to create some fractals of our own.

An attractor is a function $f: \mathbb{R}^2 \to \mathbb{R}^2$ that maps coordinates to new coordinates. Applying the function again and again yields a trajectory of coordinates. If we generate a long enough trajectory, we find that some spots are visited far more often than others. Visualizing the trajectory while coloring each spot according to how often it is visited can result in beautiful fractals.

We define an attractor function that depends on a few constant parameters, and a trajectory function that calls the attractor for a fixed number of iterations, using the output of each iteration as the input for the next. The attractor we use is the Clifford attractor (a code sketch follows the equations below):


$$x_{n+1} = \sin(a y_n) + c \cos(a x_n)$$
$$y_{n+1} = \sin(b x_n) + d \cos(b y_n)$$


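Here is a minimal sketch of both functions, compiled with numba and rendered with datashader (the function names, parameter values, and image settings are our own choices for illustration, not necessarily those in our notebook):

import numpy as np
import pandas as pd
from numba import jit
import datashader as ds
from datashader import transfer_functions as tf

@jit(nopython=True)
def clifford(x, y, a, b, c, d):
    # one application of the Clifford attractor
    return np.sin(a * y) + c * np.cos(a * x), np.sin(b * x) + d * np.cos(b * y)

@jit(nopython=True)
def trajectory_coords(a, b, c, d, x0, y0, n):
    # iterate the attractor, feeding each output back in as the next input
    xs, ys = np.zeros(n), np.zeros(n)
    xs[0], ys[0] = x0, y0
    for i in range(n - 1):
        xs[i + 1], ys[i + 1] = clifford(xs[i], ys[i], a, b, c, d)
    return xs, ys

def trajectory(a, b, c, d, x0=0.0, y0=0.0, n=10_000_000):
    xs, ys = trajectory_coords(a, b, c, d, x0, y0, n)
    return pd.DataFrame({'x': xs, 'y': ys})

df = trajectory(-1.4, 1.6, 1.0, 0.7)          # example parameter set
canvas = ds.Canvas(plot_width=700, plot_height=700)
agg = canvas.points(df, 'x', 'y')             # counts how often each pixel is visited
img = tf.shade(agg, cmap=['white', 'black'])  # darker pixels are visited more often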
A great thing about attractor functions is that you can get many different fractals depending on the parameters, so we decided to try out several parameter sets to see what we could create. All of our code was in a Jupyter notebook. While notebooks are great for combining code, results, and documentation, they are not as easy to run as a plain Python file.

However, that does not mean we can't do it. A while ago we learned that Netflix uses Jupyter notebooks quite a lot: apparently they have built an entire system around notebooks that are containerized and scheduled to run, and they have written an inspiring blog post about it.

One of the packages they use is papermill. It lets us parameterize a notebook simply by tagging the cell that contains the parameter definitions. We can then run the notebook either from the command line or from Python and store the results in a separate output notebook.
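For example, the tagged cell in fractal.ipynb might hold nothing more than default values that papermill overrides at run time (the parameter names mirror the command-line call below; the default values here are placeholders):

# cell tagged "parameters"
c = 1.0
d = -1.0
e = 0.5
f = -0.5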

$ papermill fractal.ipynb output.ipynb -p c 1.2 -p d -1.2 -p e 0.4 -p f -0.4                         
Input Notebook:  fractal.ipynb
Output Notebook: output.ipynb
100%|████████████████████████████████████████████████████████████████████| 15/15 [00:11<00:00,  1.22s/it]
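The same run can also be started from Python; here is a minimal sketch using papermill's execute_notebook with the same parameters as the command above:

import papermill as pm

# execute fractal.ipynb with the injected parameters and save the result
pm.execute_notebook(
    'fractal.ipynb',
    'output.ipynb',
    parameters=dict(c=1.2, d=-1.2, e=0.4, f=-0.4),
)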
The notebook we used is on our GitHub.
