kshoji / ble-midi-for-android
MIDI over Bluetooth LE driver for Android 4.3 or later
License: Apache License 2.0
Currently, received MIDI events are processed asynchronously.
As a result, MIDI events can be handled out of order.
Adding an event processing queue structure would let MIDI events be processed in the correct order.
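One possible shape for such a queue, sketched as plain Java (class and method names here are illustrative, not the library's actual code): a single worker thread drains a FIFO queue, so events are dispatched in arrival order even when the BLE callbacks fire on arbitrary threads.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;

// Sketch of an ordered event queue (illustrative names, not the library's API).
// BLE callbacks enqueue raw MIDI events from any thread; one worker thread
// dequeues and dispatches them, so handling order matches arrival order.
public class OrderedEventQueue {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
    private volatile boolean running = true;
    private final Thread worker;

    public OrderedEventQueue(Consumer<byte[]> handler) {
        worker = new Thread(() -> {
            while (running) {
                try {
                    handler.accept(queue.take()); // strictly FIFO
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        worker.start();
    }

    // called from BLE characteristic-changed callbacks (any thread)
    public void enqueue(byte[] midiEvent) {
        queue.offer(midiEvent);
    }

    public void stop() {
        running = false;
        worker.interrupt();
    }
}
```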
Just putting it out there: for people who are trying to use the peripheral feature on 0.0.9, use the develop branch and version 0.0.11.
My BLE device wasn't connecting
Android Lollipop (5.0) has BLE peripheral feature APIs.
Using these APIs, an Android device can emulate a BLE MIDI device.
However, when using Android devices with the library, they can't establish a connection.
Debugging environment:
Add Service implementations so that BLE MIDI can be used from several Activities or Fragments.
NullPointerException or DeadObjectException is thrown in the BleMidiPeripheralProvider.startAdvertising method.
I use your code partially, but now I think I need to just rewrite it completely.
The issue is that you made it suitable just for your use case, I believe. When a user advertises and someone connects, the library immediately starts a thread to deliver messages in BleMidiParser, and it also completely hides the stop method. But this is wrong, as in some cases you want to connect but not start processing messages until necessary. Imagine a user connects multiple devices and then chooses which one to use for input and some other for output; you don't want everything running all the time. This matters mostly for performance in apps that already use the system heavily. In my MIDI sequencer and real-time audio processing apps, which run many threads, I can't afford anything useless running in the background, as I already hit performance issues on slow devices.
Your MidiInputDevice should have open and close functions to be run manually, not automatically by the library. Or maybe start and stop, so it can be started and stopped repeatedly.
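A hypothetical sketch of the requested control surface (the class and method names below are mine, not the library's actual API): incoming packets are dropped until the app explicitly opts in, and processing can be toggled repeatedly.

```java
// Hypothetical sketch of the requested start/stop control (illustrative
// names, not the library's API): a connected device does no parsing work
// until the app calls start(), and can be paused again with stop().
public class ManualMidiInput {
    private volatile boolean started = false;
    private int handledCount = 0;

    public void start() { started = true; }   // begin processing packets
    public void stop()  { started = false; }  // pause; may be started again later
    public boolean isStarted() { return started; }
    public int getHandledCount() { return handledCount; }

    // called for each incoming BLE packet
    public void onPacket(byte[] packet) {
        if (!started) {
            return; // connected, but the app hasn't asked for input yet
        }
        handledCount++;
        // ... parse the packet and dispatch to listeners here
    }
}
```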
The MidiParser field is overwritten by asynchronous event firing.
It was OK when installed on Android 4.4, but the phone cannot run the peripheral feature.
Is the sample updated and working? I am on the develop branch, and my setup definitely works when I use an app from one other project, but there I only have an example for the central role and I need to implement both. Which branch should we use now? master looks quite old (2015), but there was work on the develop branch, so...
This library requires API Level 19. This is because of Bluetooth pairing feature.
Consider enabling BLE MIDI on API Level 18 without using the pairing feature.
Apple's BLE MIDI specification includes a timestamp with the MIDI data, but currently this library ignores the timestamp attribute.
My BLE MIDI device currently can't parse SysEx messages longer than 14 bytes.
If a longer SysEx arrives, it ignores the message.
When this issue is fixed by a future firmware update, I'll test and modify the library's code.
I tried the sample app from the Google Play market on an Android 4.4.3 phone and I was not able to receive MIDI messages on Yosemite; the devices are connecting, but I don't receive any note when sending them from the phone.
I also used the PacketLogger from Hardware IO Tools for Xcode 7 beta 5 to debug the connection, and I can see the initial negotiation packets but afterwards no packets with the note; when connecting to Yosemite with an iOS device, though, I can receive the MIDI notes.
Any hints on what could be the problem? Are there any updates related to this that are not included in the version from the Market? I didn't try to compile the code from GitHub.
Thanks
Excuse me, please let me ask a question.
I made an OnMidiInputEventListener2 and am trying to decode the events back into a byte array, but for the last part, RPN and NRPN, I can't decode them because I can't tell whether the value is 7-bit or 14-bit.
I'd like something in the parameters that indicates whether the value is 7-bit or 14-bit.
I would appreciate it if you could tell me whether the following is correct.
```java
// OnMidiInputEventListener2.java
package jp.kshoji.blemidi.listener;
import androidx.annotation.NonNull;
import jp.kshoji.blemidi.device.MidiInputDevice;
public abstract class OnMidiInputEventListener2 implements OnMidiInputEventListener{
public abstract void onSendData(@NonNull MidiInputDevice sender, @NonNull byte[] data, int offset, int count);
protected final void onSendData(@NonNull MidiInputDevice sender, int command) {
byte[] data = new byte[1];
data[0] = (byte)command;
onSendData(sender, data, 0, data.length);
}
protected final void onSendData(@NonNull MidiInputDevice sender, int command, int data1) {
byte[] data = new byte[2];
data[0] = (byte)command;
data[1] = (byte)data1;
onSendData(sender, data, 0, data.length);
}
protected final void onSendData(@NonNull MidiInputDevice sender, int command, int data1, int data2) {
byte[] data = new byte[3];
data[0] = (byte)command;
data[1] = (byte)data1;
data[2] = (byte)data2;
onSendData(sender, data, 0, data.length);
}
@Override
public void onMidiSystemExclusive(@NonNull MidiInputDevice sender, @NonNull byte[] systemExclusive) {
onSendData(sender, systemExclusive, 0, systemExclusive.length);
}
public static final int COMMAND_CH_NOTEOFF = 0x80;
public static final int COMMAND_CH_NOTEON = 0x90;
public static final int COMMAND_CH_POLYPRESSURE = 0xa0;
public static final int COMMAND_CH_CONTROLCHANGE = 0xb0;
public static final int COMMAND_CH_PROGRAMCHANGE = 0xc0;
public static final int COMMAND_CH_CHANNELPRESSURE = 0xd0;
public static final int COMMAND_CH_PITCH = 0xe0;
public static final int COMMAND_SYSEX = 0xf0;
public static final int COMMAND_MIDITIMECODE = 0xf1;
public static final int COMMAND_SONGPOSITION = 0xf2;
public static final int COMMAND_SONGSELECT = 0xf3;
public static final int COMMAND_F4 = 0xf4;
public static final int COMMAND_F5 = 0xf5;
public static final int COMMAND_TUNEREQUEST = 0xf6;
public static final int COMMAND_SYSEX_END = 0xf7;
public static final int COMMAND_TRANSPORT_MIDICLOCK = 0xf8;
public static final int COMMAND_F9 = 0xf9;
public static final int COMMAND_TRANSPOORT_START = 0xfa;
public static final int COMMAND_TRANSPORT_CONTINUE = 0xfb;
public static final int COMMAND_TRNASPORT_STOP = 0xfc;
public static final int COMMAND_FD = 0xfd;
public static final int COMMAND_ACTIVESENSING = 0xfe;
public static final int COMMAND_META_OR_RESET = 0xff;
@Override
public void onMidiNoteOff(@NonNull MidiInputDevice sender, int channel, int note, int velocity) {
if (channel < 0 || channel >= 16) {
return;
}
if (note < 0 || note >= 128) {
return;
}
if (velocity < 0 || velocity >= 128) {
return;
}
onSendData(sender, COMMAND_CH_NOTEOFF + channel, note, velocity);
}
@Override
public void onMidiNoteOn(@NonNull MidiInputDevice sender, int channel, int note, int velocity) {
if (channel < 0 || channel >= 16) {
return;
}
if (note < 0 || note >= 128) {
return;
}
if (velocity < 0 || velocity >= 128) {
return;
}
onSendData(sender, COMMAND_CH_NOTEON + channel, note, velocity);
}
@Override
public void onMidiPolyphonicAftertouch(@NonNull MidiInputDevice sender, int channel, int note, int pressure) {
if (channel < 0 || channel >= 16) {
return;
}
if (note < 0 || note >= 128) {
return;
}
if (pressure < 0 || pressure >= 128) {
return;
}
onSendData(sender, COMMAND_CH_POLYPRESSURE + channel, note, pressure);
}
@Override
public void onMidiControlChange(@NonNull MidiInputDevice sender, int channel, int function, int value) {
if (channel < 0 || channel >= 16) {
return;
}
if (function < 0 || function >= 128) {
return;
}
if (value < 0 || value >= 128) {
return;
}
onSendData(sender, COMMAND_CH_CONTROLCHANGE + channel, function, value);
}
@Override
public void onMidiProgramChange(@NonNull MidiInputDevice sender, int channel, int program) {
if (channel < 0 || channel >= 16) {
return;
}
if (program < 0 || program >= 128) {
return;
}
onSendData(sender, COMMAND_CH_PROGRAMCHANGE + channel, program);
}
@Override
public void onMidiChannelAftertouch(@NonNull MidiInputDevice sender, int channel, int pressure) {
if (channel < 0 || channel >= 16) {
return;
}
if (pressure < 0 || pressure >= 128) {
return;
}
onSendData(sender, COMMAND_CH_CHANNELPRESSURE + channel, pressure);
}
@Override
public void onMidiPitchWheel(@NonNull MidiInputDevice sender, int channel, int amount) {
if (channel < 0 || channel >= 16) {
return;
}
if (amount < 0 || amount >= 16384) {
return;
}
int hi = (amount >> 7) & 0x7f;
int lo = (amount) & 0x7f;
onSendData(sender, COMMAND_CH_PITCH + channel, lo, hi);
}
@Override
public void onMidiTimeCodeQuarterFrame(@NonNull MidiInputDevice sender, int timing) {
if (timing < 0 || timing >= 128) {
return;
}
onSendData(sender, COMMAND_MIDITIMECODE,timing);
}
@Override
public void onMidiSongSelect(@NonNull MidiInputDevice sender, int song) {
if (song < 0 || song >= 128) {
return;
}
onSendData(sender, COMMAND_SONGSELECT,song);
}
@Override
public void onMidiSongPositionPointer(@NonNull MidiInputDevice sender, int position) {
if (position < 0 || position >= 16384) {
return;
}
int hi = (position >> 7) & 0x7f;
int lo = position & 0x7f;
onSendData(sender, COMMAND_SONGPOSITION, lo, hi);
}
@Override
public void onMidiTuneRequest(@NonNull MidiInputDevice sender) {
onSendData(sender, COMMAND_TUNEREQUEST);
}
@Override
public void onMidiTimingClock(@NonNull MidiInputDevice sender) {
onSendData(sender, COMMAND_TRANSPORT_MIDICLOCK);
}
@Override
public void onMidiStart(@NonNull MidiInputDevice sender) {
onSendData(sender, COMMAND_TRANSPOORT_START);
}
@Override
public void onMidiContinue(@NonNull MidiInputDevice sender) {
onSendData(sender, COMMAND_TRANSPORT_CONTINUE);
}
@Override
public void onMidiStop(@NonNull MidiInputDevice sender) {
onSendData(sender, COMMAND_TRNASPORT_STOP);
}
@Override
public void onMidiActiveSensing(@NonNull MidiInputDevice sender) {
onSendData(sender, COMMAND_ACTIVESENSING);
}
@Override
public void onMidiReset(@NonNull MidiInputDevice sender) {
onSendData(sender, COMMAND_META_OR_RESET);
}
@Override
public void onRPNMessage(@NonNull MidiInputDevice sender, int channel, int function, int value) {
int rlsb = 100; //lo
int rmsb = 101; //hi
int rlsb_value = function & 0x7f;
int rmsb_value = (function >> 7) & 0x7f;
int msb = 6; //hi
int lsb = 38; //lo
int msb_value = (value >> 7) & 0x7f;
int lsb_value = value & 0x7f;
onSendData(sender, COMMAND_CH_CONTROLCHANGE+ channel, rmsb, rmsb_value);
onSendData(sender, COMMAND_CH_CONTROLCHANGE+ channel, rlsb, rlsb_value);
onSendData(sender, COMMAND_CH_CONTROLCHANGE + channel, msb, msb_value);
onSendData(sender, COMMAND_CH_CONTROLCHANGE+ channel, lsb, lsb_value);
}
@Override
public void onNRPNMessage(@NonNull MidiInputDevice sender, int channel, int function, int value) {
int nlsb = 98; //lo
int nmsb = 99; //hi
int rlsb_value = function & 0x7f;
int rmsb_value = (function >> 7) & 0x7f;
int msb = 6; //hi
int lsb = 38; //lo
int msb_value = (value >> 7) & 0x7f;
int lsb_value = value & 0x7f;
onSendData(sender, COMMAND_CH_CONTROLCHANGE+ channel, nmsb, rmsb_value);
onSendData(sender, COMMAND_CH_CONTROLCHANGE+ channel, nlsb, rlsb_value);
onSendData(sender, COMMAND_CH_CONTROLCHANGE + channel, msb, msb_value);
onSendData(sender, COMMAND_CH_CONTROLCHANGE+ channel, lsb, lsb_value);
}
}
```
The SysEx message of Apple BLE MIDI specification is like this.
(Header) (Timestamp) F0 xx xx xx ... ... xx xx xx (Timestamp) F7
If the latter Timestamp value is F7 (the same value as End of SysEx), the SysEx data will be truncated: the parser stops at the first F7 value it sees, so it mishandles the case where F7 F7 arrives at the end of a SysEx.
MidiSystem doesn't maintain Synthesizer objects when MIDI devices are connected / disconnected.
Rather than Thread.sleep, I think it would be better to use Object.wait(1000), which can be broken out of with notify.
The rhythm becomes stable; here I confirmed that sequence playback timing becomes stable with this.
The waiting side is this:

```java
synchronized (lockObject) {
    while (!flag) {
        try {
            lockObject.wait(3000);
        } catch (InterruptedException ex) {
            // also reached when interrupted at app shutdown
            break;
        }
    }
}
```

The notifying side is this:

```java
synchronized (lockObject) {
    flag = true;
    lockObject.notifyAll();
}
```

Also, please tell me: is Thread.sleep(10) the recommended approach for Bluetooth? It works even without considering this, but unlike Microsoft's or Google's implementations, I can tune this one myself, so I'm using it. ^^
By the way, do you have a license for commercial use?
Hi,
Not sure if this library is still maintained, but I've found a potential problem around how event timings are coordinated.
Specifically the code that starts here.
For example, let's imagine I have a lastTimestamp of 8000 and a timestamp of 5000, i.e. there were 5192 ms between the last MIDI event and the current one. Now the condition of the if statement is false, because 5000 + (8192 / 2) = 9096, which obviously isn't < 8000 (lastTimestamp), and so no adjustment is made to the current timestamp.
This is now an issue because, when the final result is calculated, part of the calculation is adjustedTimestamp - lastTimestamp, or 5000 - 8000, which, once the system time is taken into account, means we end up with an event timing earlier than the previous event's, despite the current event taking place 5192 ms after the previous one.
I suggest all that needs to happen instead is to take MAX_TIMESTAMP, subtract lastTimestamp from it, and add the difference onto the current timestamp to get the proper adjustment. E.g. 8192 - 8000 + 5000 = 5192 ms; this can then be added onto the last system time recorded, and we get correct event ordering.
Please let me know if I've missed something completely obvious, otherwise I am happy to put up a PR.
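The suggested wraparound adjustment can be sketched as follows (the helper and constant names are mine; MAX_TIMESTAMP is the 13-bit BLE MIDI timestamp wrap value of 8192):

```java
// Sketch of the wraparound fix proposed above (illustrative names).
public class TimestampUtil {
    static final int MAX_TIMESTAMP = 8192; // 13-bit BLE MIDI timestamp wraps here

    // elapsed milliseconds between two 13-bit timestamps, handling wraparound
    public static int elapsedMillis(int lastTimestamp, int timestamp) {
        if (timestamp >= lastTimestamp) {
            return timestamp - lastTimestamp;
        }
        // timestamp went "backwards": the 13-bit counter wrapped
        return MAX_TIMESTAMP - lastTimestamp + timestamp;
    }
}
```

With lastTimestamp = 8000 and timestamp = 5000 this yields 8192 - 8000 + 5000 = 5192 ms; adding that onto the last recorded system time preserves event ordering.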
EDIT: BlueZ has a fantastic implementation of this here
PeripheralActivity and CentralActivity can't find a MIDI device when I click the start button. The log always shows 'AudioTrack: isLongTimeZoreData zoer date time xx Seconds'.
Apply the same modification as in the USB MIDI library's issue:
kshoji/USB-MIDI-Driver#44
I've tried testing with my ZenWatch; it can handle the BLE Central feature.
Add a synthesizer, or MIDI recorder function with a simple user interface.
Move library to BLE-MIDI-library, but don't change the repository URL.
When the last MidiInputDevice has been disconnected, this displays 1 (but it should be 0).
```java
// implementation of OnMidiDeviceDetachedListener
@Override
public void onMidiInputDeviceDetached(MidiInputDevice midiInputDevice) {
    Log.i("test", "Size of devices: " + bleMidiCentralProvider.getMidiInputDevices().size());
}
// ...
```