I recently uploaded the source code for the MIDI animator I programmed with jMonkeyEngine to GitHub: midi-animator.
To use it, see the Readme there. You’ll need jMonkeyEngine, and understanding Java would probably help. (I’m not really interested in making a standalone user-friendly app at the moment; Stephen Malinowski’s “Music Animation Machine” is still available if you want that. I’m more interested in having something I can continually customize and play around with.)
In addition to perhaps being sloppy (as I never intended to share it), the source code is a bit bloated as it’s actually part of a larger project to create a MIDI editor that will feature my melody generator. But that’s a long way off; I’m not actively working on that at the moment, and probably won’t anytime soon.
So… there it is if anyone else wants to play around with it, or contribute their own improvements to it… feel free!
Also, it’s my first time uploading something to GitHub, so I’m not very familiar with the platform yet… I hope I did it right.
I recently posted my latest composition, “Secrets of the Ancient Seas”! Check it out:
I write in the description:
This began as another track inspired by my novel, but the rapid string arpeggios and spirit of the melodies quickly began to remind me of an adventurer braving the seas, so I continued down that path instead. I even threw in some wind machine for atmosphere, a percussion instrument in Garritan Personal Orchestra I’ve been wanting to try but never really had the occasion to use. I think it works well in this piece.
My favorite part of this piece comes at the 4:22 mark. At first I meant simply to contrast all the melodic material with some more atmospheric material, perhaps only wandering arpeggios, but I couldn’t resist adding some melodic phrases along with them in the form of descending minor thirds. With the minor chords forming the harmony, these descending minor thirds sound, to me, very haunting and creepy. Almost the way a child calls out “Where are you?” to taunt hiding prey. The sound of being lost at sea on a foggy night, perhaps? Vaguely hearing the call of the deadly sirens in the distance? Anyway, I love how it sounds.
I also like what’s happening harmonically, as it’s more chromatic than my usual fare:
We start in the tonic of B minor, then progress steadily through the circle of fifths, to F-sharp minor, C-sharp minor, G-sharp minor, and finally D-sharp minor. From here, we go back and forth between D-sharp minor and D major (the relative major of B minor), a transformation Neo-Riemannian triadic theory calls an S transformation, for “slide,” since the chord slides between major and minor while keeping its third as a common tone (in this case an F sharp). I think the major chord sounds particularly refreshing there, with so many minor chords preceding it. Finally we get a C-sharp major seventh for the final three measures, which serves as a secondary dominant in B minor (it implies a resolution to F-sharp major, the dominant of B minor). But first the passage repeats, and the C-sharp major seventh is just as capable of resolving to B minor (that resolution perhaps doesn’t sound as strong, but that’s OK; the stronger resolution comes after the repeat).
When we do resolve to the dominant, F-sharp major, the opening phrase of the piece’s main melody is echoed, but it sounds rather exotic and dissonant accompanied by the dominant chord rather than the tonic, and the clash propels the piece forward to the main melody’s final statements.
Although this little sequence is hardly revolutionary (and so may not stand out to any listener), it’s certainly not the sort of thing I’d usually compose, so I’m rather pleased with it.
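For the curious, here’s a minimal Java sketch of the two moves described above (purely illustrative; it isn’t code from the piece or from midi-animator, and the class and method names are made up for the example): walking up the circle of fifths from B minor, and the slide transformation that swaps D-sharp minor and D major around their shared third, F sharp.

```java
// Pitch classes 0-11, with C = 0. Purely an illustration of the
// progression described above, not code from the actual project.
public class SlideDemo {

    static final String[] NAMES =
        {"C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};

    // Pitch classes of a triad built on the given root.
    static int[] triad(int root, boolean minor) {
        return new int[] { root % 12, (root + (minor ? 3 : 4)) % 12, (root + 7) % 12 };
    }

    // Neo-Riemannian S ("slide"): keep the third, move root and fifth by a
    // semitone, flipping the quality. A minor triad slides down to the major
    // triad a semitone lower, and vice versa.
    static int slide(int root, boolean minor) {
        return ((minor ? root - 1 : root + 1) + 12) % 12; // new root; quality flips
    }

    static String name(int root, boolean minor) {
        return NAMES[root % 12] + (minor ? " minor" : " major");
    }

    public static void main(String[] args) {
        // The circle-of-fifths chain: Bm -> F#m -> C#m -> G#m -> D#m
        int root = 11; // B
        StringBuilder chain = new StringBuilder();
        for (int i = 0; i < 5; i++) {
            chain.append(name(root, true)).append(i < 4 ? " -> " : "");
            root = (root + 7) % 12; // up a perfect fifth
        }
        System.out.println(chain);

        // The slide between D# minor and D major, sharing F# as the third.
        int dSharpMinorRoot = 3;                       // D#
        int dMajorRoot = slide(dSharpMinorRoot, true); // 2 = D
        System.out.println(name(dSharpMinorRoot, true) + " slides to " + name(dMajorRoot, false));
        System.out.println("Common tone (the third of both chords): "
            + NAMES[triad(dSharpMinorRoot, true)[1]]); // F#
    }
}
```

Running it prints the chain B minor -> F# minor -> C# minor -> G# minor -> D# minor, then confirms that the slide of D-sharp minor is D major and that the shared third is F sharp.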
Also, at long last I managed to upload a truly 60 fps animation thanks to Shotcut, a nice free video editor that now replaces the annoying Windows Movie Maker for me. It’s not a super-advanced editor, but it does what I need (syncing audio and adding titles), it’s free, and it doesn’t come with annoying limitations meant to entice me to buy some deluxe version.
IBM’s Watson supercomputer AI has created a trailer for an AI horror film! Oh my! How interesting! How ironic! How impressive! IBM is full of geniuses! Let’s watch!
Erm… ok…
Alas, I am not at all impressed with the result. This trailer tells me hardly anything about the story. I fear we’ll have to wait until AIs actually “understand” language and story (or at least analyze these elements a bit more closely) before they can create trailers that resonate with humans. Who are the characters? What’s the main conflict of the story? What’s the spiritual (inner) conflict? What’s the hook? Etc. Trailers are not just a collection of tone shifts. What stupid investors are investing in IBM based on this sort of nonsense? (And how can I get some of their money myself?)
Anyway, what we end up with is not so much a “movie trailer created by AI,” as though “AI” were some generic, mysterious black box, but rather a movie trailer created in some algorithmic fashion that a human (or group of humans) designed. Which, of course, is what all “AI-generated” products amount to: human-created algorithms that mimic and automate processes we may not necessarily understand.
And therein lies the true goal of “AI research”. The point is not to create a robot that can do everything a human can do but remains just as mysterious as a human brain. The point is to understand what intelligence actually is in the first place. And when we understand that, we may find we don’t need or care about sophisticated human-like robots anyway. And any sort of creepy fear that comes from wondering about the possibilities of rogue robots or the nature of digital consciousness is the result of human idiocy, spiritually and logically. Spiritually in that consciousness is not merely an emergent property of matter (we are not just meat robots). Logically in that if we can design a robot capable of “going rogue,” then we can just as easily design it not to “go rogue” in the first place.
“What if the AIs kill us?!” It’s already not that hard to make a machine that can kill you; why is a robot doing it somehow scarier? I suppose because you don’t understand where the “impulse” to kill is coming from. And anyway, if we’re smart enough to create robots that can actually decide to kill in some human-like way, then we’d naturally understand where that decision comes from in the first place and would prevent it (or override the capacity to decide not to kill, if we’re making an evil robot army, I guess).
(Of course some AI research is perfectly happy to stay within the bounds of mimicking and automating thought processes, as these algorithms can have useful applications, such as handwriting recognition software or my own forays into algorithmic music generation, which is ultimately music theory research.)
And let us not soon forget the actual screenplay written by an artificial neural network: