
Converting Weather Into Music With Dark Sky, Spotify For Tycho

In a unique algorithmic twist, Tycho recently teamed up with Lee Martin to craft an app which employs the user's local weather forecast to create a custom weather-based playlist. Here Martin walks us through the process of the app's creation, and how it works.

__________________________________

Guest post by Lee Martin. This article originally appeared on Medium

Yesterday, Tycho and I released Forecast, an app which uses your local weather forecast to generate a unique Spotify and Apple Music playlist. Not only was this a fun developer challenge, it was also an opportunity to work with Scott (Tycho) directly on design. I think the concept works because it meets each of the three principles I strive for: Simple, Accessible, and Magic. It’s simple in that the user merely needs to tap the sun in order to invest themselves in the app. It’s accessible via the web from a URL. No downloading needed. And it turns the fucking weather into music! That’s magic.

The sweet design doesn’t hurt either.

These days users want to put themselves within the app and location is one of those readily available variables which can make the experience more unique. I’ve used it to similar effect when terrifying Manson’s following and allowing Khruangbin fans to generate music for upcoming flights. Just don’t be creepy with location data.

So, how does one turn the weather into music? Read on to find out.

Getting the Forecast

Cloudy and rainy icons by Tycho

It all starts with a pair of coordinates which you can get by using the Geolocation API. In addition, I’m using the MaxMind GeoIP service as a fallback in case something goes wrong. MaxMind will turn the user’s IP address into a location which is a solid fallback for a weather app.

navigator.geolocation.getCurrentPosition(response => {
  // response.coords.latitude
  // response.coords.longitude
}, error => {
  geoip2.city(response => {
    // response.location.latitude
    // response.location.longitude
  }, error => {
    // ok, something really went wrong
  })
})

With coordinates now available it’s time to get the forecast. Dark Sky is a popular weather app (the one I use) which also has an excellent and affordable API for getting past, current, and future weather conditions. I’ve actually used it at least once before on the Purple Rain Report. The endpoint we’re most interested in is forecast which takes a pair of coordinates and returns the weather. But first, let’s talk about my setup.

I am a recent Netlify convert and I simply can’t get enough of how this platform has streamlined my workflow. I’m able to run their platform right from my computer using Netlify Dev which keeps all my environment variables synced. In addition, Netlify also allows you to easily deploy serverless functions which can also be tested locally with Dev. Functions are a perfect solution to meet Dark Sky’s security requirement of making calls via your server. Ya know, without the server.

To initialize a function, you can use the Netlify CLI:

netlify functions:create --name forecast

With your function created, you can write it very simply. I’m using the node-fetch module which brings window.fetch to Node.js to hit the Dark Sky API with a pair of coordinates passed as parameters to the Netlify function. Here’s a gist to get you going.
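A minimal sketch of such a function, assuming the Dark Sky secret key is stored in a DARK_SKY_KEY environment variable, might look something like this:

const fetch = require('node-fetch')

exports.handler = async event => {
  // Coordinates arrive as query string parameters from the client
  const { latitude, longitude } = event.queryStringParameters

  // Dark Sky's forecast endpoint: /forecast/[key]/[latitude],[longitude]
  const response = await fetch(
    `https://api.darksky.net/forecast/${process.env.DARK_SKY_KEY}/${latitude},${longitude}`
  )
  const forecast = await response.json()

  return {
    statusCode: 200,
    body: JSON.stringify(forecast)
  }
}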

Finally, if you’re using Netlify Dev, you can test your function locally by simply sending a request to the appropriate endpoint. In the case of a function called “forecast,” it would be accessed from /.netlify/functions/forecast. So, once I obtain those coordinates, I can send a fetch request from my client to receive the user’s current weather conditions.

fetch(`/.netlify/functions/forecast?latitude=${lat}&longitude=${lon}`)
  .then(response => response.json())
  .then(data => {
    // data.currently
  })

Correlating Weather to Audio Features

Sun swipe

Dark Sky sends over a lot of interesting data about the current weather. The next step is to relate that data somehow to features of an audio track. In previous projects such as Sadboi Detector and AirKhruang, I have utilized the track audio features Spotify provides such as valence and energy. In the case of this app, we looked at several audio features and simply chose which weather measurement might symbolically relate. For example, a gusty day might produce tracks with a higher tempo. Or the temperature might affect the valence or mood of the songs. We took liberties in these connections and just had fun with it.
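For reference, pulling those audio features for a batch of tracks from the Spotify Web API might look something like this sketch, assuming you already have an access token and an array of Spotify track IDs:

const fetch = require('node-fetch')

const getAudioFeatures = async (trackIds, accessToken) => {
  // The audio-features endpoint accepts up to 100 comma-separated track IDs
  const response = await fetch(
    `https://api.spotify.com/v1/audio-features?ids=${trackIds.join(',')}`,
    { headers: { Authorization: `Bearer ${accessToken}` } }
  )
  const { audio_features } = await response.json()
  // Each entry includes valence, energy, tempo, danceability, and more
  return audio_features
}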

Most of these properties on Spotify run on a scale from 0.0 to 1.0. Somehow I was going to need to create a similar scale for the weather measurements being utilized. I ended up researching the average, maximum, and minimum of each of these and then used a normalization function to convert them into the scale I desired.

let normalize = (val, min, max) => {
  return Math.max(0, Math.min(1, (val - min) / (max - min)))
}
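A couple of throwaway values show how the Math.max/Math.min clamping keeps out-of-range readings on the scale:

normalize(9, 0, 18)  // 0.5 (an average wind day)
normalize(25, 0, 18) // 1.0 (clamped; unusually gusty)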

For example, if the average wind speed in the United States is 9mph, I could say that is 0.5 on our scale. Making 0mph equal to 0.0 and 18mph equal to 1.0. It’s not very scientific but it works. Once you do this for everything, you’re left with an array of data which represents all of the weather properties, the audio feature they relate to, and their normalized value.

let measures = [
  {
    property: 'wind',
    feature: 'tempo',
    value: normalize(wind, 0, 18)
  }
]

I then loop through each of these measures, sorting the entire song list by feature values which are nearest to the normalized weather value. This sorted songs list is then used to increment the score of a song depending on its position in the stack. This process is continued for each measurement until all song scores are adjusted accordingly.

measures.forEach(measure => {
  songs.sort((a, b) => {
    let aValue = a.features[measure.feature]
    let bValue = b.features[measure.feature]
    return Math.abs(aValue - measure.value) - Math.abs(bValue - measure.value)
  })
  songs.forEach((song, i) => {
    let score = _.round(1 - (i / (songs.length - 1)), 3)

    song.score += score
  })
})

Songs are sorted by score and then the app takes the 25 highest scorers. Finally, the selected tracks are shuffled, producing a playlist.

songs.sort((a, b) => {
  return a.score - b.score
}).reverse()

let playlist = _.shuffle(_.take(songs, 25))

It’s worth mentioning that I integrated Contentful as the system where we manage our pool of songs. The client was also provided with two UI extensions for easily searching Spotify and Apple Music songs from within the comfort of Contentful. Check out this blog for info on how I integrate Contentful into my app framework of choice, Nuxt.js. In addition, I wrote an article on extending Contentful to include streaming service search fields.
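As a rough sketch of that setup, pulling the pool out of Contentful with their JavaScript SDK looks something like the following. The "song" content type and its field names here are just placeholders for however the entries are actually modeled.

const contentful = require('contentful')

const client = contentful.createClient({
  space: process.env.CONTENTFUL_SPACE_ID,
  accessToken: process.env.CONTENTFUL_ACCESS_TOKEN
})

// "song" and its fields are placeholder names, not the real content model
client.getEntries({ content_type: 'song' }).then(entries => {
  let songs = entries.items.map(item => ({
    title: item.fields.title,
    spotifyId: item.fields.spotifyId,
    appleMusicId: item.fields.appleMusicId,
    score: 0
  }))
  // songs is now ready to be scored against the normalized weather measures
})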

Design and Transitions

Scott and I worked through the design in just two phone calls.

On the first call, I explained how this app, like others I have developed, should be very simple to understand and quick to use. A utility. I explained the issues surrounding responsive design and recommended a mobile-first approach. While I’m no expert on transitions, I told Scott to go nuts and we would figure everything out as soon as he presented a vision. On the second call, Scott presented this wonderful design in Photoshop and walked me through some thoughts on how the transitions could work, dragging elements with the move tool. He truly kept it simple, focused, and fun.

Rather than jump straight to code, I decided to open up Framer Classic as a quick way to prototype some of the transitions. I broke that work up into three chunks: the intro, the creation sequence, and the reveal of the save buttons. This allowed us to easily test ideas about transitions and timing without getting bogged down by too much code. Once we were confident in the general approach, I then jumped to CodePen.IO and scripted the entire animation sequence as one long playback using the incredible animation library Anime.js. Only then did I attempt to start integrating it into the flow of our application.

Anime has this great timeline function which allows you to string multiple element animations together. For example, if we think abstractly for a moment, the front page consists of three elements: the sky, the water, and the sun. We know we want the water to rise in from the bottom and then have the sun scale in from the center horizon. First, we lay it out.

I experimented with several different HTML layouts but decided to contain the sky and water in a flexed box and position the sun absolute in the dead center of the page. Here’s some simplified code:

<main>
  <div id="sun"></div>
  <section id="sky"></section>
  <section id="water"></section>
</main>
<style>
  main {
    display: flex;
    flex-direction: column;
    height: 100vh; /* give the flex column the full viewport to divide */
  }
  div#sun {
    height: 50vw;
    left: 50%;
    position: absolute;
    top: 50%;
    transform: translate(-50%, -50%) scale(0);
    width: 50vw;
  }
  section#sky {
    flex: 1;
  }
</style>

That flex: 1 allows the sky to take up the full height of the page initially and I don’t think I need to tell you what scale(0) does to the sun. With this layout, we can then script our Anime timeline.

let tl = anime.timeline()

tl.add({
  targets: '#water',
  height: '50%'
})
tl.add({
  targets: '#sun',
  scale: 1
})

Pretty nice, right? What about a pulsating sun? Try adding a loop parameter and setting the direction to alternate.

let pulse = anime({
  targets: '#sun',
  scale: 0.9,
  direction: 'alternate',
  loop: true,
  easing: 'linear'
})

If you declare it, like I did above with pulse, you also have some nice controls at your disposal like pulse.pause(). Ok, what about that pie swipe effect? Well… that took a few tries. My first attempt utilized a rotating div which “hid” a section of the sun by actually just showing sky. I then got fancy and integrated the poorly supported (but coming) conic gradient until someone told me my sun disappeared on Firefox. This led me to revisit my best friend, HTML5 canvas.

In addition to animating DOM nodes, Anime will allow us to animate any old JavaScript object. So, if I combined that with the HTML5 canvas arc function, I might have something. Here’s what that looks like.

let sun = { hidden: 0 }

anime({
  targets: sun,
  duration: 500,
  hidden: 180,
  easing: 'linear',
  update: () => {
    let canvas = document.getElementById('canvas')
    let context = canvas.getContext('2d')
    context.fillStyle = 'yellow'
    context.clearRect(0, 0, canvas.width, canvas.height)
    context.beginPath()
    context.moveTo(canvas.width / 2, canvas.height / 2)
    context.arc(
      canvas.width / 2,
      canvas.height / 2,
      canvas.width / 2,
      0,
      sun.hidden * (Math.PI / 180),
      true
    )
    context.closePath()
    context.fill()
  }
})

Thanks

Thanks again to all the fine folks at Mom + Pop, Another Planet, and NinjaTune who helped make this experience happen. It was an absolute pleasure working with this crew the last few weeks. And thanks to Scott for believing in the concept and then providing the whole damn design to make it happen. Tycho “Weather” is out now.
