Animation in Blender with MIDI and Python

The Project

One of my first inspirations as I was learning Blender was the series of videos created by Animusic. Ever since, I have wanted to incorporate MIDI data into 3D animation. Now that I have experience with Blender and Python, I think it’s time.

The goal is to create a system that automates a piano keyboard based on any MIDI file. While some compromises had to be made and the workflow could use some improvement, I was able to come up with a working solution in a few weeks. I am sure I will revisit this and hopefully integrate it fully into Blender, but for now here’s the project and code.


The Scene

The simplest way to integrate a scripted animation is a series of indexed empties that drive all the other animations, including material changes. Once the system is designed, this takes only a few lines of Python.

My scene at this point consisted of a single-note piano action, fully rigged inside a collection named “0”. I then duplicated and renamed it with a Python script until I had enough copies to cover the full range of MIDI notes (MIDI note numbers go up to 127).

While the actual process was slightly more involved (rigging deformations to move the keys around their supports, changing the key shapes, and so on), this is the general idea; a sketch of the duplication step follows.
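
Here is roughly what that duplication step looks like in code. This is a minimal sketch rather than my exact script: it assumes the rigged collection “0” is linked to the scene collection, and the key spacing value is purely illustrative.

import bpy

# Duplicate the rigged collection "0" until every MIDI note (0-127) has a copy
src = bpy.data.collections["0"]

for note in range(1, 128):
    dup = bpy.data.collections.new(str(note))
    bpy.context.scene.collection.children.link(dup)
    for obj in src.objects:
        obj_copy = obj.copy()
        if obj.data:
            obj_copy.data = obj.data.copy()
        obj_copy.location.x += note * 0.024  # hypothetical key spacing
        dup.objects.link(obj_copy)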

The Code

DISCLAIMER: This code is quick and dirty and may not work as expected for you. This resource is only intended to document my process. You are still free to use what you can in your own projects, at your own risk.

Now that the scene is structured how we want, let’s get to work making it move!
While it would likely be possible, and more efficient, to use a single script from within Blender, I needed the flexibility to run this without setting up an environment with the necessary libraries. All data fed to Blender had to be readable by a vanilla Blender installation (or portable archive).

midi-parse.py

from mido import MidiFile, tick2second, second2tick, tempo2bpm
import json


def get_rate(ppq, t):
    # Milliseconds per tick and ticks per second at tempo t (in BPM)
    ms_per_tick = 60000 / (t * ppq)
    ticks_per_sec = 1000 / ms_per_tick
    return ms_per_tick, ticks_per_sec


mid = MidiFile('YOUR_MIDI.mid', clip=True)

data = {}
ppq = mid.ticks_per_beat
# 500000 microseconds per beat is the MIDI default tempo (120 BPM)
quarter_duration = tick2second(ppq, ppq, 500000)
ticks_per_second = second2tick(1, ppq, 500000)
tempo = 120
c_time = 0
event_count = 0

for t in mid.tracks:
    for m in t:
        print(m)
        # Number events globally so later tracks don't overwrite earlier keys
        event = 'event_' + str(event_count).rjust(4, '0')
        event_count += 1
        data[event] = {}
        d = data[event]

        # Every message advances the running tick count, even skipped ones
        c_time += m.time

        if m.type in ("note_on", "note_off"):
            d['type'] = m.type
            d['note'] = m.note
            d['velocity'] = m.velocity
            d['time'] = m.time
            d['all_time'] = c_time

        elif m.type == "control_change":
            d['type'] = m.type
            d['control'] = m.control
            d['value'] = m.value
            d['time'] = m.time
            d['all_time'] = c_time

        elif m.type == "set_tempo":
            tempo = tempo2bpm(m.tempo)
            d['type'] = m.type
            d['tempo'] = tempo
            d['time'] = m.time
            d['all_time'] = c_time
            print(d['tempo'])

        else:
            d['type'] = 'empty'
            d['time'] = m.time
            d['all_time'] = c_time

tickrate = get_rate(ppq, tempo)

with open('YOUR_JSON.json', 'w') as outfile:
    json.dump(data, outfile, indent=4)

This exports an easier-to-access .JSON file that Blender can read natively. Here is an excerpt to show how that looks.

{
    "event_0000": {
        "type": "empty",
        "time": 0,
        "all_time": 0
    },
    "event_0001": {
        "type": "note_on",
        "note": 55,
        "velocity": 54,
        "time": 2773,
        "all_time": 2773
    },
    "event_0002": {
        "type": "note_on",
        "note": 47,
        "velocity": 42,
        "time": 10,
        "all_time": 2783
    },
    "event_0003": {
        "type": "note_on",
        "note": 40,
        "velocity": 51,
        "time": 13,
        "all_time": 2796
    },
    "event_0004": {
        "type": "control_change",
        "control": 64,
        "value": 127,
        "time": 2,
        "all_time": 2798
    },
    "event_0005": {
        "type": "note_on",
        "note": 52,
        "velocity": 42,
        "time": 9,
        "all_time": 2807
    },
    "event_0006": {
        "type": "note_off",
        "note": 52,
        "velocity": 64,
        "time": 2143,
        "all_time": 4950
    }
}

json-to-img.py

from PIL import Image
import numpy as np
import json
from pathlib import Path


def get_ticks_per_frame(bpm, ppq, fps):
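    # Convert tempo (BPM) and MIDI resolution (PPQ) into ticks per video frame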
    tps = ppq * (bpm / 60)
    tpf = tps / fps
    return tpf


def get_time(t, d):
    new_time = round(t / d)
    return new_time


def get_frame_img(img_data, frame, frame_image, rows):
    # Copy a window of 'rows' rows from the full image, starting at 'frame'
    # (unused here; superseded by the PIL crop below)
    for r in range(rows):
        frame_image[r] = img_data[frame + r]
    return frame_image


path = Path(__file__).parent / "./YOUR_JSON.json"
with path.open() as f:
    data = json.load(f)

white = [255, 255, 255]
black = [0, 0, 0]
rows = 64
ppq = 480
bpm = 120
fps = 60
tpf = get_ticks_per_frame(bpm, ppq, fps)
tick_divisor = tpf

# Track the onset row of each note as it turns on (MIDI notes run 0-127)
note_val = {i: 0 for i in range(128)}
note_offset = 20

# The latest event time determines the image height
max_time = max(e['all_time'] for e in data.values())

# 88 columns (one per piano key), one row per video frame
img_w, img_h = 88, round(max_time / tpf)
img_data = np.zeros((img_h, img_w, 3), dtype=np.uint8)
frame_image = np.zeros((rows, img_w, 3), dtype=np.uint8)

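# Paint each note: a pixel at the onset row, then fill until its note_off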
for e in data:
    if data[e]['type'] == 'note_on':
        n = data[e]['note']
        t = get_time(data[e]['all_time'], tick_divisor)
        img_data[t, n - note_offset] = white
        note_val[n] = t
    if data[e]['type'] == 'note_off':
        n = data[e]['note']
        t = note_val[n]
        new_t = get_time(data[e]['all_time'], tick_divisor)
        for l in range(t, new_t):
            img_data[l, n - note_offset] = white

img = Image.fromarray(img_data, 'RGB')
img.save('full_range.png')
img.show()

max_frame = round(max_time / tpf)

for frame in range(max_frame):
    if frame + rows < max_frame:
        area = (0, frame, 88, frame + rows)
        # Zero-pad the frame number so the image sequence sorts correctly
        img.crop(area).save('./RENDERS/Frames/frames_' + str(frame).zfill(4) + '.jpg')

This script exports a complete .PNG image that contains every MIDI event, then exports one-frame slices of the defined range to generate a scrolling image sequence. At 480 PPQ, 120 BPM, and 60 fps that works out to 16 ticks per frame, so each pixel row of the image spans 16 ticks. The slices can then be mapped onto an object in Blender like any other texture.
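
For completeness, here is one way the sequence could be wired into a material from Python. This is a hedged sketch, not my actual node setup: the material name, the total frame count, and the Base Color hookup are all assumptions.

import bpy

# Load the exported slices as an image-sequence texture
mat = bpy.data.materials.new("NoteScroll")
mat.use_nodes = True
tex = mat.node_tree.nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("//RENDERS/Frames/frames_0000.jpg")
tex.image.source = 'SEQUENCE'
tex.image_user.frame_start = 1
tex.image_user.frame_duration = 5000  # assumption: total number of slices
tex.image_user.use_auto_refresh = True  # advance the sequence during playback

# Feed the scrolling image into the shader
bsdf = mat.node_tree.nodes["Principled BSDF"]
mat.node_tree.links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])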

midi-animator.py

import bpy
import json
from pathlib import Path


def get_ticks_per_frame(bpm, ppq, fps):
    tps = ppq * (bpm / 60)
    tpf = tps / fps
    return tpf


def add_time(c, t):
    c += t
    return c

 
def get_frame(t, tf):
    frame = (t / tf)
    return round(frame)


def set_blank_keys(n, f):
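    # Key the object's current state so the next pose change starts from here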
    obj = bpy.data.objects[str(n)]
    obj.keyframe_insert(data_path='rotation_euler', frame=f)
    obj.keyframe_insert(data_path='color', frame=f)


def set_anim_keys(n, f, v, m, vel):
    # define animation target object
    obj = bpy.data.objects[str(n)]

    # set amplitude of motion and velocity-scaled color
    amp = v * 0.075
    col = v * (vel / 127)

    # set animation keyframes
    obj.color = (col, col, col, 1)
    obj.rotation_euler = (obj.rotation_euler[0], amp, obj.rotation_euler[2])
    obj.keyframe_insert(data_path='rotation_euler', frame=f)
    obj.keyframe_insert(data_path='color', frame=f)


def note_on(n, v, e, t):
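    # Higher velocity means fewer ease-in frames, i.e. a faster strike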
    ease_frames = e - round((v / 127) * e)
    frame = get_frame(t, ticks_per_frame)

    set_blank_keys(n, frame - ease_frames)
    set_anim_keys(n, frame, 1, velocity_multiplier, v)


def note_off(n, v, e, t):
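    # The key eases back to rest over the frames after release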
    ease_frames = e - round((v / 127) * e)
    frame = get_frame(t, ticks_per_frame)

    set_blank_keys(n, frame)
    set_anim_keys(n, frame + ease_frames, 0, velocity_multiplier, 0)


def pedal_change(v, t):
    # Stub: converts the pedal event to a frame, but no pedal animation yet
    frame = get_frame(t, ticks_per_frame)

# Global variables
path = Path(__file__).parent / "../YOUR_JSON.json"

ppq = 480
bpm = 120
fps = 60
 
c_time = 0
ease_multiplier = 6
velocity_multiplier = 4
ticks_per_frame = get_ticks_per_frame(bpm, ppq, fps)

# Open json data file
with path.open() as f:
    data = json.load(f)

for e in data:
    # Increase global tick count
    if 'time' in data[e]:
        c_time = add_time(c_time, data[e]['time'])
    # Check if event is note_on, note_off, or control_change (pedal)
    if data[e]['type'] == 'note_on':
        note_on(data[e]['note'], data[e]['velocity'], ease_multiplier, c_time)
    if data[e]['type'] == 'note_off':
        note_off(data[e]['note'], data[e]['velocity'], ease_multiplier, c_time)
    if data[e]['type'] == 'control_change' and data[e]['control'] == 64:
        pedal_change(data[e]['value'], c_time)

This file is run from Blender and animates your range of objects (designated by ‘n’, the MIDI note’s numerical value) using a set amplitude, which makes it easy to repurpose the script for other setups. My initial project was just a keyboard, so the master rotation was well under one radian; once I added the entire action, the rig required a much greater master rotation.
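
As a usage note, the script can also be run headlessly from the command line (the file names here are placeholders):

blender YOUR_SCENE.blend --background --python midi-animator.py

In background mode the inserted keyframes are lost unless the script saves the file before exiting, for example with bpy.ops.wm.save_mainfile().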
