
Space Debris visualization in Scientific American

Some time ago Scientific American approached me to commission a visualization of space debris. And there is a lot of space debris floating around the Earth: think of abandoned rockets or broken satellites. The purpose of the visualization was to illustrate that there is a huge amount of space debris, not necessarily to provide an exact representation of it. In fact, not all the data required to determine the exact positions of the debris and satellites was available in the dataset I received: right ascension and argument of perigee were missing, so I used random numbers for those. All the other data needed to calculate the orbits was there. To calculate the orbits of the satellites and debris I applied Kepler’s Laws of Planetary Motion.
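
For anyone curious how that works out in code, here is a minimal, simplified Processing sketch (not the original code) that places a single object on its orbit from Keplerian elements. The element values, the fixed-iteration solver for Kepler’s equation, and the randomized right ascension and argument of perigee are all assumptions for illustration.

```java
// Minimal sketch, not the original code: position one piece of debris from Keplerian elements.
float a   = 7000;                  // semi-major axis in km (example value)
float e   = 0.01;                  // eccentricity
float inc = radians(51.6);         // inclination
float raan = random(TWO_PI);       // right ascension of ascending node (missing in the data, so randomized)
float argP = random(TWO_PI);       // argument of perigee (missing in the data, so randomized)
float M    = random(TWO_PI);       // mean anomaly

// Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E (Newton iteration)
float E = M;
for (int i = 0; i < 10; i++) {
  E = E - (E - e * sin(E) - M) / (1 - e * cos(E));
}

// True anomaly and distance from the focus (the Earth)
float nu = 2 * atan2(sqrt(1 + e) * sin(E / 2), sqrt(1 - e) * cos(E / 2));
float r  = a * (1 - e * cos(E));

// Position in the orbital plane, then rotate by argument of perigee, inclination and RAAN
float xp = r * cos(nu);
float yp = r * sin(nu);
float x = xp * (cos(argP) * cos(raan) - sin(argP) * sin(raan) * cos(inc))
        - yp * (sin(argP) * cos(raan) + cos(argP) * sin(raan) * cos(inc));
float y = xp * (cos(argP) * sin(raan) + sin(argP) * cos(raan) * cos(inc))
        - yp * (sin(argP) * sin(raan) - cos(argP) * cos(raan) * cos(inc));
float z = xp * sin(argP) * sin(inc) + yp * cos(argP) * sin(inc);
```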

The graphic was created in Processing. With all the orbits calculated and the satellites and debris positioned randomly along them, the next step was to get the positioning and color right. Positioning was rather easy, since it only required applying some transformations to the scene, which resulted in a nice perspective (circles nearby are larger than those further away). Coloring was the final step. Initially the idea was to color by country (US, USSR, China and others), but this resulted in an image with colored dots all over. So to communicate a more focused message, we decided to show the difference between active satellites (magenta) and space debris (black). As a nice extra, the ISS is also marked, to give an even better sense of the amount of debris.
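
To give a rough idea of that depth cue, the snippet below is a sketch under my own assumptions (not the original code): it tilts the scene and scales each dot by its distance to an assumed viewpoint, so nearby debris is drawn larger. It assumes a P3D renderer and a hypothetical list of orbit positions.

```java
// Sketch of the depth cue, assuming a P3D renderer and a hypothetical positions list.
void drawDebris(ArrayList<PVector> positions) {
  pushMatrix();
  translate(width / 2, height / 2);
  rotateX(radians(-20));               // tilt the whole scene for a nicer perspective
  rotateY(frameCount * 0.002);         // slow rotation around the Earth
  for (PVector p : positions) {
    float d = dist(p.x, p.y, p.z, 0, 0, 800);   // distance to an assumed viewpoint
    float size = map(d, 200, 1600, 6, 1);       // nearby dots larger, far dots smaller
    pushMatrix();
    translate(p.x, p.y, p.z);
    ellipse(0, 0, size, size);
    popMatrix();
  }
  popMatrix();
}
```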

After completing the project, I played around with the data a little more, just to see if an animated version would have an even greater impact in communicating the message. Well, judge for yourself…

Data Centric Universe: then and now


For Popular Science I created a visualization that has been published both in the magazine and as an interactive version for the iPad and the web. The visualization shows the known universe of 1950 and today’s known universe (2011). Modern telescopes came into use after 1950, and 93% of the known universe has been discovered since then.

The visualization was built in Processing, and then turned into a zoomable image that uses OpenLayers.
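
The exact export pipeline isn’t described here, but a plausible sketch in Processing would be to render to a large off-screen buffer and cut it into tiles that a zoomable viewer such as OpenLayers can serve. The buffer size, tile size and file names below are arbitrary assumptions.

```java
// Plausible pipeline sketch, not the published code: render big, then slice into tiles.
int tileSize = 256;
PGraphics big = createGraphics(4096, 4096);
big.beginDraw();
big.background(0);
// ... draw the universe visualization into 'big' here ...
big.endDraw();

for (int ty = 0; ty < big.height / tileSize; ty++) {
  for (int tx = 0; tx < big.width / tileSize; tx++) {
    PImage tile = big.get(tx * tileSize, ty * tileSize, tileSize, tileSize);
    tile.save("tile_" + tx + "_" + ty + ".png");   // tiles are then served to the zoomable viewer
  }
}
```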

Eyeo Data Visualization Challenge: Ghost Counties

About a day before the deadline, I submitted my entry for the Eyeo Data Visualization Challenge by Visualizing.org, where the grand prize is a ticket to the brilliant Eyeo Festival.

You can see the final result here: http://www.janwillemtulp.com/eyeo.

The visualization depicts the number of homes and vacant homes for all the counties of each state. The size of the outer bubble represents the total number of homes, and the size of the inner bubble represents the number of vacant homes. The y-axis shows the population size (on a logarithmic scale), and the x-axis shows the number of vacant homes per population. Each bubble is also connected with a line to another axis: the population / home ratio. At the top right you can see exact numbers for this data.
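
As a rough illustration of that encoding (hypothetical function and scale constants, not the submitted code), a single county could be drawn in Processing like this:

```java
// Sketch of the bubble encoding with assumed domains and scale factors.
void drawCounty(float totalHomes, float vacantHomes, float population) {
  float x = map(vacantHomes / population, 0, 0.5, 50, width - 50);            // vacant homes per population
  float y = map(log(population), log(100), log(10000000), height - 50, 50);   // population on a log scale
  // scale by area rather than diameter so bubble sizes compare honestly
  float outer = sqrt(totalHomes) * 0.05;
  float inner = sqrt(vacantHomes) * 0.05;
  noFill();
  ellipse(x, y, outer, outer);   // outer bubble: total number of homes
  fill(0, 50);
  ellipse(x, y, inner, inner);   // inner bubble: number of vacant homes
}
```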

This time I built the visualization in Processing, mainly because I expected to work with large datasets from the US Census Bureau and thought I might have to use OpenGL for better performance. Eventually I didn’t use OpenGL. Building the visualization in Processing was lots of fun. To get a sense of the data I tried as many as five completely different approaches. Here are some of the sketches that eventually led to this visualization (view this selection on my Flickr stream).

The data itself was not very complex, just rather big. The biggest challenge was to find a creative way to visualize it without using a map, which would have been the obvious choice since the data is about locations.