Dylan Freedman

Hi! I’m Dylan, a full-stack software engineer, designer, and journalist. I strive to code new tools to help journalists and the public.

I work at MuckRock, where I lead development of DocumentCloud, a platform for journalists to upload, search, analyze, annotate, and share documents.

Previously, I worked as a software engineer at Google, researching and developing websites in Machine Perception. In the past, I’ve taught data journalism and studied computational journalism, computer science, and music.

To get in touch, feel free to reach out at freedmand@gmail.com.

Projects

I pursue many projects on and off work. Here are some highlights. For reporting work, see Media below.

Poly

Blog: poly.dev

Poly is a new programming language for the web that I’m currently working on as a personal project. The language is in its early stages and is designed to compile to full web applications in HTML, CSS, and JavaScript. I’m implementing it in ReasonML.


Covid Map

Website: covid19map.us

Covid Map is an interactive, explorable, and zoomable map of current and historical COVID-19 cases and deaths in the United States. The site utilizes deck.gl, a WebGL-based library for displaying large datasets, to performantly map each county’s data. The data is sourced from the New York Times’ open-source COVID-19 data. First published in March 2020, the site automatically pulls in data updates every day using a custom Google Cloud Functions workflow. Source code.
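The New York Times dataset reports cumulative counts per county per day, so showing “current” activity requires differencing consecutive days. Here is a minimal sketch of that step; the field names (`date`, `fips`, `cases`) mirror the NYT CSV columns, but the function itself is illustrative and not the site’s actual pipeline.

```javascript
// Sketch: derive daily new cases from cumulative NYT-style county rows.
// Rows are assumed to be sorted by date within each county (FIPS code).
function dailyNewCases(rows) {
  const lastByFips = new Map(); // FIPS code -> previous cumulative count
  const out = [];
  for (const row of rows) {
    const prev = lastByFips.get(row.fips) ?? 0;
    out.push({ date: row.date, fips: row.fips, newCases: row.cases - prev });
    lastByFips.set(row.fips, row.cases);
  }
  return out;
}
```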


Planet Gallery

Website: planet.gallery

Planet Gallery is a virtual showcase of every known exoplanet, each displayed as a contour plot of its surrounding starfield. This work was done in collaboration with Lawrence Peirson. The site was featured in Stanford’s Art of Science 2020 Exhibition.


DataJourn

Course Website: datajourn.com

In fall 2019, I taught an undergraduate course called Data Journalism at Temple University. The course was divided into three parts: data, visualization, and code. Students learned how to 1) find datasets and analyze them using spreadsheets, 2) design on paper and create graphics in Figma, and 3) code basic HTML, JavaScript, and CSS to put together interactive webpages.

Course content includes a visual guide on how pivot tables work, an interactive bridge-building game to learn HTML, and a Halloween-themed puzzle to find coding mistakes.


Ripple Plastic

Website: rippleplastic.com

Ripple Plastic is a virtual reality experience I created in conjunction with award-winning photographer Mandy Barker and the Stanford Journalism Program. The interactive exploration highlights the growing plastic pollution in the world’s oceans by juxtaposing plastic debris Barker photographed on beaches with a guided narration.

I coded the experience using the A-Frame WebVR framework (source code). The process involved 1) mapping 2D Photoshop layers into 3D objects using a custom-designed Blender pipeline, 2) creating a layout algorithm to place objects randomly on a sphere using a combination of Perlin noise and Poisson-disc sampling, 3) coordinating production with fellow journalism students, and 4) composing custom theme music for the experience. I wrote a blog post about the process. The experience premiered at the Our Plastic Ocean exhibition at Impressions Gallery, England in 2019.
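The Poisson-disc part of the layout idea can be sketched briefly: sample uniform random points on the unit sphere and accept a candidate only if it keeps a minimum angular distance from every point placed so far. This is a simplified illustration of the rejection-sampling half only; the actual pipeline also weighted placement with Perlin noise.

```javascript
// Uniform random point on the unit sphere (the z/theta method).
function randomUnitVector(rand = Math.random) {
  const z = 2 * rand() - 1;
  const theta = 2 * Math.PI * rand();
  const r = Math.sqrt(1 - z * z);
  return [r * Math.cos(theta), r * Math.sin(theta), z];
}

// Angle in radians between two unit vectors, clamped for float safety.
function angleBetween(a, b) {
  const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  return Math.acos(Math.min(1, Math.max(-1, dot)));
}

// Poisson-disc-style placement: accept a candidate only if it is at least
// `minAngle` radians away from every previously placed point.
function placeOnSphere(count, minAngle, maxTries = 10000) {
  const points = [];
  for (let tries = 0; points.length < count && tries < maxTries; tries++) {
    const candidate = randomUnitVector();
    if (points.every((p) => angleBetween(p, candidate) >= minAngle)) {
      points.push(candidate);
    }
  }
  return points;
}
```

The minimum-angle constraint is what keeps the debris visually evenly scattered rather than clumped, which plain uniform sampling tends to produce.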


Inferactive

Website: inferactive.org

Inferactive is an interactive data tool for finding insights and inferences that I built as part of my master’s thesis. The entirely web-based platform features a four-step analysis and discovery flow: 1) upload CSV data or select an example dataset, 2) refine by selecting columns to include/exclude from a table/detail view, 3) view statistics and one-dimensional charts for each data column, and 4) discover insights and trends in the data by viewing a shuffleable assortment of auto-generated plots correlating two data columns.

The project is written in Svelte and geared toward frontend performance and handling large datasets.
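Scoring how strongly two numeric columns relate is the kind of measure a tool could use to rank auto-generated two-column plots. A minimal sketch using the Pearson correlation coefficient follows; it illustrates the idea only and is not Inferactive’s actual ranking code.

```javascript
// Sketch: Pearson correlation between two equal-length numeric columns.
// Returns a value in [-1, 1]; values near ±1 suggest a strong linear trend.
function pearson(xs, ys) {
  const mean = (a) => a.reduce((s, v) => s + v, 0) / a.length;
  const mx = mean(xs);
  const my = mean(ys);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < xs.length; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx += (xs[i] - mx) ** 2;
    dy += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}
```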


Sounds

Observable Notebook

“Sounds” is an interactive sound wave primer I wrote for fun in 2018. This online computational essay guides readers through the basics of sound wave theory. Part 2 delves into more experimental sound functions and 8-bit chiptunes.
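The essay’s starting point, a pure sine tone, comes down to sampling a sine function at a fixed rate. Here is a minimal sketch of that building block; the sample-rate and frequency values are just example numbers, not taken from the notebook.

```javascript
// Sketch: generate raw samples of a sine wave at a given frequency.
// Each sample i is the wave's amplitude at time i / sampleRate seconds.
function sineWave(frequency, durationSec, sampleRate = 44100) {
  const numSamples = Math.floor(durationSec * sampleRate);
  const samples = new Float32Array(numSamples);
  for (let i = 0; i < numSamples; i++) {
    samples[i] = Math.sin(2 * Math.PI * frequency * (i / sampleRate));
  }
  return samples;
}
```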


Tapcompose

Website: tapcompose.com

“Auto-complete for music composition.” Tapcompose is a web app I built as part of a Stanford course that lets anyone write sheet music, bar by bar, with auto-generated suggestions. The web app features dynamic sheet music generation and a synced playback bar, with sounds generated using the Web Audio API in-browser. Source code. YouTube demo.
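Synthesizing a suggested note in the browser requires mapping notated pitches to frequencies before scheduling Web Audio oscillators. A sketch of the standard equal-temperament conversion is below; it shows the general technique, not Tapcompose’s actual playback code.

```javascript
// Sketch: equal-temperament pitch mapping (MIDI note number -> Hz).
// A4 = MIDI note 69 = 440 Hz; each semitone is a factor of 2^(1/12).
function midiToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}
```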


AudioSet

Website: research.google.com/audioset

AudioSet is a static website I designed at Google to showcase an ontology of sound events and a collection of over 2 million manually annotated YouTube clips. The website presents all the YouTube clip thumbnails, dynamically requesting more as the user scrolls using JSONP. I used Closure Templates to build reusable components for static pages. I also contributed to the paper behind the website, which was presented at the 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) in New Orleans.


Sonority

Website: sonority.io

Sonority is an interactive visualization of the harmonic similarity between all the Beatles’ songs. I built the graphic using D3.js. Using a custom Python processing script, I statically generated thousands of audio clips representing the most aligned excerpt of each pair of songs. This work was part of my undergraduate thesis, which I presented at the 2015 International Society for Music Information Retrieval (ISMIR) conference in Malaga, Spain.
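One common way to score harmonic similarity between two audio excerpts is cosine similarity over chroma (pitch-class) vectors, which summarize how much energy each of the 12 pitch classes carries. The sketch below illustrates that general idea only; it is not the thesis’s exact algorithm.

```javascript
// Sketch: cosine similarity between two equal-length feature vectors,
// e.g. 12-bin chroma vectors. 1 = identical direction, 0 = unrelated.
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}
```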


Media

1:46.45 — The Unbroken National High School 800m Record

For this report, I interviewed record-holder Michael Granville, whose 1996 race performance at the semifinals of the California State Meet has yet to be matched. Beyond his running performance, the video explores his relationship with his father who coached him and unearths never-before-uploaded footage of the actual race. The video features original music I composed in high school as an 800m athlete.

Elusive: Recording the Secret Lives of Bay Area Mountain Lions

Article in SFGate

In recent years, residents of the Santa Cruz mountains have organized social media communities on Facebook to share trail camera footage of their wooded backyards. The shared media reveals just how prevalent, and elusive, mountain lions are in the Bay Area. I interviewed leaders of the social media groups and produced a video report and article.

Fear meets Fire

Public Radio International (PRI) Article

An article I co-authored with Jackie Botts on the impact of the 2017 “Wine Country” fires on undocumented immigrants. The article was picked up by PRI’s The World and features both a written component and a video report.

Analysis of the 2016 Election Forecasts

Interactive Data Visualization

An independent news article and interactive data piece showcasing the differing forecasts from major news outlets for the 2016 presidential election and how they compared to the actual election outcome. Coded in vanilla JavaScript.
