from Hacker News

Midihum: An ML-Based MIDI Humanizing Tool

by erwald on 12/30/23, 11:30 AM with 18 comments

  • by DrSiemer on 12/31/23, 9:20 AM

    A before and after demo would help. I don't even have time to actually make music anymore; I'm not going through the trouble of installing this and then finding or creating a suitable MIDI file just to test it.
  • by cmiller1 on 12/31/23, 1:39 PM

    > Using midihum is easy. First clone the repository, navigate to the midihum/ directory, and install dependencies:

    > pip install -r requirements.txt

    Why not just upload it to PyPI so you can do all this with one pip command?
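
    One way to get there, as a sketch: a minimal pyproject.toml makes the package installable with a single pip command, either from PyPI after publishing or directly from the repository URL. The distribution name, version, dependency list, and CLI entry point below are assumptions for illustration, not taken from the actual repo.

        [build-system]
        requires = ["setuptools>=61"]
        build-backend = "setuptools.build_meta"

        [project]
        name = "midihum"              # hypothetical distribution name
        version = "0.1.0"             # placeholder version
        dependencies = [
            # the pinned packages from requirements.txt would go here
        ]

        [project.scripts]
        midihum = "midihum.cli:main"  # assumes a CLI entry point like this exists

    With that file in place, pip install git+<repository URL> or, once published, pip install midihum would replace the clone-and-requirements steps.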

  • by a-dub on 12/31/23, 5:56 AM

    Cool! You might be able to improve it substantially with low effort by using a simple approach to capturing dynamics, like adding a few lags of the existing features.
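
    A rough sketch of what adding lag features could look like, assuming the model's per-note features sit in a pandas DataFrame ordered by playback position (the column names here are made up for illustration, not midihum's actual feature set):

        import pandas as pd

        def add_lag_features(df: pd.DataFrame, columns: list[str], n_lags: int = 3) -> pd.DataFrame:
            """Append lagged copies of the given feature columns.

            Each new column holds the value of the feature n notes earlier,
            giving the model a short window of local context/dynamics.
            """
            out = df.copy()
            for col in columns:
                for lag in range(1, n_lags + 1):
                    out[f"{col}_lag{lag}"] = df[col].shift(lag)
            # The first few notes have no history; fill with 0 so downstream
            # models can still consume those rows.
            return out.fillna(0)

        # Toy example: three notes with pitch and duration features.
        notes = pd.DataFrame({"pitch": [60, 64, 67], "duration": [0.5, 0.5, 1.0]})
        print(add_lag_features(notes, ["pitch", "duration"], n_lags=2))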
  • by CrypticShift on 12/30/23, 12:15 PM

    I mean, you are already "performing"; it's not as if it is computer-generated (or drawn with the mouse), so it should already be "humanized." I believe the real issue lies with subpar MIDI keyboards that lack sufficient responsiveness. This is rarely a problem for me with guitars, where I can always dial in velocities with exactly the precision I have in mind (and I'm an amateur).

    FYI, another way MIDI is humanized in DAWs is in the temporal domain (i.e., rhythm/timing). This was popularized by Ableton's Groove Pool [1]; a rough timing-jitter sketch follows below.

    [1] https://www.ableton.com/en/manual/using-grooves/
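
    For comparison, a crude sketch of temporal humanization using the mido library: it randomly nudges the delta time of each note event by a few ticks, which is much blunter than applying a groove template but illustrates the timing-domain idea. File names and the jitter range are placeholders.

        import random
        import mido

        def jitter_timing(path_in: str, path_out: str, max_ticks: int = 10) -> None:
            """Perturb note timing by a small random number of ticks.

            MIDI file times are deltas, so later events drift along with each
            change; this is crude, but enough to break machine-perfect timing.
            """
            mid = mido.MidiFile(path_in)
            for track in mid.tracks:
                for msg in track:
                    if msg.type in ("note_on", "note_off"):
                        msg.time = max(0, msg.time + random.randint(-max_ticks, max_ticks))
            mid.save(path_out)

        jitter_timing("quantized.mid", "humanized.mid")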