A few thoughts on the topic.
We need to come up with a specification for how MIDI data translates into haptic signals/messages, and make it universal and flexible (which can still be tricky).
Main points of interest:
- The MIDI format is binary and can be embedded into an MP4 container as a track alongside the audio/video
- Lots of MIDI devices on the market and software for editing/viewing MIDI data (just google "midi controller")
- We can utilize hardware MIDI controllers for "control my sex toy" project
- MIDI can simultaneously control up to 16 different devices, one per channel (for example: sex toy + gloves + body suit); see the parsing sketch after this list
- Recording MIDI data is easy with existing DAW software (Reaper, Ableton Live, Studio One, etc.). DAWs are focused mainly on audio, so video preview is limited but possible
- We can support both JSON and MIDI formats and build a converter tool between them
Limitations:
- 16 channels (i.e. at most 16 different devices per script)
- Only some message types offer 14-bit precision (values 0 to 16383), chiefly pitch bend and the paired control-change controllers; most data has 7-bit precision (values 0 to 127)
- A 7-bit value maps to the 0-1 range in steps of roughly 0.008 (1/127)
- A 14-bit value maps to the 0-1 range in steps of roughly 0.00006 (1/16383); see the sketch after this list
- Not a very flexible format, but usable (it was introduced in the 1980s)
There is room for discussion and prioritization, but in my opinion this idea has great potential in terms of "standardization" and reuse of existing tech instead of reinventing the wheel.
We could start with "control my sex toy" and then apply this approach universally.