Jack2 : A Personal Analysis (Part #1)

Blog post by Barrett on Tue, 2013-07-02 12:40

Intro

In the last year I managed to play a bit more with the Haiku media_kit.
It has already been discussed in various places whether jack2 should be adopted, ported, or somehow integrated into Haiku. There are various opinions out there, and after digging into the topic a bit I want to show you what I think about it.

Jack2 is a real-time audio server for UNIX systems; more specifically, it is an SMP-focused reimplementation of Jack (which is single-threaded). It provides a protocol used by audio apps for inter-communication. A good and growing number of apps support it, and it is becoming interesting for professional audio as well.

Besides GNU/Linux there are ports available for various operating systems, such as FreeBSD, Windows, Solaris and Mac OS X. The interesting thing about the OS X port is the availability of JackRouter, which allows routing audio buffers between Jack apps and Core Audio apps.

The idea is very similar to what the Haiku media_kit provides.
There are various differences, though, including Jack's lack of video support and of multiple formats.

Jack forces every app to use 32-bit floating point values for its buffers, probably to prevent any latency due to resampling algorithms. This is very different from what the media_kit does, since Haiku provides a mechanism for format negotiation and painless audio resampling.
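As an illustration of what that difference means for a client, here is a minimal sketch of the sample conversion that Jack's float-only rule pushes onto applications with non-float sources; the function name is mine, not part of either API:

```cpp
#include <cstdint>
#include <vector>

// Jack mandates 32-bit float samples in [-1.0, 1.0], so a client whose
// source material is 16-bit PCM must convert every buffer itself.
// Under the media_kit the format would instead be negotiated, and the
// system mixer would take care of any conversion.
std::vector<float> Int16ToJackFloat(const std::vector<int16_t>& in)
{
	std::vector<float> out;
	out.reserve(in.size());
	for (int16_t s : in)
		out.push_back(s / 32768.0f);	// scale into [-1.0, 1.0)
	return out;
}
```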

The following text is a summary of some articles I wrote on my blog.

The Narrow Way

So, one day I decided to check out the source and start some exploration, to see how feasible a port of jack2 is. The resulting work is located on my GitHub page, with the Haiku-specific code in its own subdirectory like the other operating systems. Unexpectedly, it worked: connecting jack_rec and jack_simple_client produces the expected sinusoidal sound in the file.

Unfortunately, excluding for now the noticeable lack of any media_kit backend, there are some other problems to be resolved first:

  • The Jack shm implementation is not working as it should (probably due to http://dev.haiku-os.org/ticket/2657).
  • Haiku does not support POSIX real-time thread priorities (http://dev.haiku-os.org/ticket/8600).
  • I had to comment out some memory-locking code, due to the lack of mlock/mlockall support.
  • Jack warns about lots of graph reorders.
  • jack_test does not pass some tests.

In the meantime, I've also done some brainstorming about possible solutions (in the same order):

  • The best solution is probably to just fix the bug.
  • The idea is to replace the current HaikuThread class (essentially a temporary hack) and have
    it inherit directly from JackThread instead of JackPosixThread. This would allow using the
    native Haiku thread implementation internally, just emulating what Jack expects.
  • I don't know whether Haiku lacks something that prevents supporting them at the system level. AFAIK Haiku areas obviously support this functionality, so it might be possible to just insert some ifdefs into the Jack implementation, or to add a Haiku implementation alongside the POSIX and System V ones.
  • The last two are probably related to the others.
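As a rough idea of the second point, a native HaikuThread could translate the POSIX real-time priorities Jack asks for into Haiku's native scheme, where values above 100 are treated as real-time; this is only a sketch of such a mapping, not actual Jack or Haiku code:

```cpp
// Hypothetical mapping from a POSIX SCHED_FIFO priority (1..99) to a
// Haiku thread priority. On Haiku, priorities above 100 are treated as
// real-time, so the POSIX range is shifted into 101..199. A native
// HaikuThread class could use something like this internally whenever
// Jack asks for a real-time thread.
int PosixRTToHaikuPriority(int posixPriority)
{
	if (posixPriority < 1)
		posixPriority = 1;
	else if (posixPriority > 99)
		posixPriority = 99;
	return 100 + posixPriority;	// 101..199: real-time range on Haiku
}
```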

So without an app to route sounds between Jack and the media_server, we would still have
two separate worlds, since I find it very difficult to make jack2 and the media_server friends (both being asocial animals).

I don't think this is a very good scenario, and I don't see it as Haiku-stylish. At the same time, I would love to see a hybrid jack2/media_server client publishing a tunnel that passes audio between the two servers.

That said, my personal opinion is that reaching a stable and performant Jack2 could require a lot of work for poor results; the amount of work may easily vary depending on the problems encountered along the way. This was my feeling when I tried to create a simple Jack capture node playing buffers through BSoundPlayer in the backend.

Rise And Shine

We are not lost; there is already another way, which *may* be better. The idea is as simple as it is difficult: force Jack2 applications to use the media_server by wrapping the Jack API.

This consists of creating an ad hoc set of code emulating the Jack API but doing things through the media_server in the back-end. The idea is to have an application publishing BMediaNodes; those nodes will just wrap the functions a client needs to process audio buffers.

Let me show some practical examples:

jack_get_buffer_size
jack_get_sample_rate

These could be trivially replaced using the info we have from the format negotiation done by the background node.
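For example, the wrapper could cache the result of the format negotiation and answer these calls locally; a minimal sketch, where the struct and its contents are hypothetical and only the two function names follow the Jack API:

```cpp
#include <cstdint>

// Hypothetical cache, filled in once the media_kit format negotiation
// for the background node has completed.
struct NegotiatedFormat {
	uint32_t frameRate;		// e.g. 48000, from media_raw_audio_format
	uint32_t bufferFrames;	// frames per buffer agreed with the producer
};

static NegotiatedFormat gFormat = { 48000, 1024 };

// Emulated Jack calls: instead of asking a Jack server, they simply
// return what the media_kit negotiated for us.
extern "C" uint32_t jack_get_sample_rate(void* /*client*/)
{
	return gFormat.frameRate;
}

extern "C" uint32_t jack_get_buffer_size(void* /*client*/)
{
	return gFormat.bufferFrames;
}
```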

jack_midi_clear_buffer
jack_midi_event_get
jack_midi_event_write
jack_midi_get_event_count

I have never used the midi2_kit or the midi_kit, but I bet the kits provide the functionality needed to emulate them. That is material for the next article (sorry!).

jack_port_get_buffer
jack_port_register
jack_port_type_get_buffer_size

These functions are used by Jack clients to get data in and out. When a BufferReceived() call arrives at the BBufferConsumer, the wrapper could theoretically call the process() callback the client has set via jack_set_process_callback, and at the same time provide the data in a stack which the client accesses through the jack_port_* functions. Ideally, a jack_port registration would result in an input/output for the media_node.
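A minimal sketch of that dispatch, with all class and member names hypothetical; only the shape of the process callback follows the Jack API:

```cpp
#include <cstdint>
#include <vector>

// Signature modeled on Jack's process callback: frame count plus the
// user argument registered with jack_set_process_callback.
typedef int (*ProcessCallback)(uint32_t nframes, void* arg);

// Hypothetical wrapper state: the client's process callback plus the
// latest buffer handed to us by the media_kit.
struct JackClientWrapper {
	ProcessCallback process = nullptr;
	void* processArg = nullptr;
	std::vector<float> currentBuffer;	// what jack_port_get_buffer would return

	// What the BBufferConsumer's BufferReceived() hook would do:
	// stash the samples, then run the client's process callback.
	int BufferReceived(const float* data, uint32_t nframes)
	{
		currentBuffer.assign(data, data + nframes);
		if (process != nullptr)
			return process(nframes, processArg);
		return 0;
	}
};
```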

These examples are taken from my personal research on jalv; there are actually something like 20-30 functions in total to be wrapped for a theoretical layer allowing jalv to run as a media node. That is not a lot compared to big beasts such as Ardour, but enough to demonstrate that it's feasible.

My major concerns are about the API contained in the control.h header, and some other “minor” functionality such as graph reorder callbacks; I’m not sure how to emulate them. For the control API, probably the best way is to do some conversion in the background to make the BMediaRoster controllable through the Jack control API. However, I need more research to evaluate how feasible that is.

Additionally, this approach would have other advantages: for example, since the Jack headers are LGPL, a layer of this type could be integrated into Haiku, much like what we already do with the VST media_addon or the FreeBSD compatibility layer for network cards.

A Latency Test

In the end I just want to show you a little latency comparison I've done:

Buffer length    Haiku (media_server)    Linux (Jack2)
1024             13.7 ms                 42.7 ms
256              7.75 ms                 10.7 ms

The results are encouraging, but I don't have a realtime kernel on my Debian Linux; I'm pretty sure that with a realtime kernel jack2 and Haiku would be closer. I don't think Jack and the media_kit can do much better, due to the hardware limits of my poor integrated Intel HDA card. The Haiku values are taken from the system mixer (using Cortex); for Linux they come from the QJackCtl config window.

Hope it was interesting; I just wanted to share what I think and what I discovered, in the hope that
it could help clarify the state of things, and maybe make the way clearer for a developer who wants to work on this. Additionally, there may be some errors in my article, since I am first of all in a learning phase, so any correction/suggestion/opinion is appreciated!

Comments

Re: Jack2 : A Personal Analysis (Part #1)

Hi Barrett!

Thanks for writing up your thoughts on how we could port JACK to Haiku - it made for very interesting reading.

I too would love to see JACK for Haiku - MusE and qtractor are getting really good under Linux now and they would be even cooler if we could port them to Haiku, once JACK has been ported!

The biggest obstacle that you mention, if Haiku JACK is to compare at all with Linux JACK, seems to be the Haiku kernel's lack of RT scheduling. Looking at the ticket, there doesn't seem to be anyone working on it either. How usable could JACK be under Haiku without RT?

Another worry over the viability of a Haiku JACK port at present is it seems Haiku has no support for USB audio devices yet - right?

I look forward to your reply and the next part of your analysis - thanks!

Re: Jack2 : A Personal Analysis (Part #1)

Hello : ), as you can see in the ticket comments, the Haiku scheduler has to be modified to support POSIX RT threads, but I tried to emulate it by setting the thread priority to a value greater than 100, which is the value usually used for RT threads under Haiku. One of the main advantages, theoretically, is that Haiku is already designed to be real-time out of the box, so by overriding the POSIX implementation and writing a class using native threads the problem could probably be worked around. Unfortunately, the main side effect of the Haiku design is that it sometimes makes porting programs difficult, but I find that acceptable given its good purposes.

Anyway, when I tested jack2 under Haiku with the dummy driver, the behaviour was not bad at all, considering it is a hack: xruns were quite rare (or zero, depending on the situation) and it was stable (no crashes at all), something usable for an initial port of an app.

Unfortunately USB audio cards are a big gap in the system; the OSS port lacks support too. Maybe at first glance it would be better to have a port of FFADO, just because it would support more devices. Anyway, I think apps are a priority over drivers: basically, people are more motivated to develop hardware support when there are already stable apps which use it.

Re: Jack2 : A Personal Analysis (Part #1)

That's great to hear about the lack of xruns! I wasn't expecting that; am I right in thinking there is no proper backend for JACK on Haiku yet? Do any of the netjack backends work?

I know nothing about coding for Haiku, but I would've thought getting USB(2) audio class compliant devices working would be easier and would get more people going. FFADO is pretty touch and go even under Linux, and there are a number of USB2 CC audio devices known to work under Linux now, thanks in no small part to the huge success of the iPads and their increasing selection of audio apps. I own the Scarlett 2i4, which is USB2 CC and works great with ALSA and JACK.

Re: Jack2 : A Personal Analysis (Part #1)

Interesting article!
Would you know how the Jack2 latency is computed? On Haiku it depends at least on the audio frame rate and size for instance.
Anyway a fair comparison would need to measure latencies between an input and output signal on the audio card.

Re: Jack2 : A Personal Analysis (Part #1)

Well... I think both Cortex and QJackCtl make an estimate of the latencies; this is not a measurement. As far as I can tell, for Jack the formula is:

frames_per_period*period_count = latency (frames)

Dividing those values by the sample rate gives you the time in ms, for example:

256*2/48.0 = 10.66

In Cortex instead, according to the code, the latency is calculated by the node itself by calling the BMediaRoster::GetLatencyFor() method. Usually nodes calculate latency as the sum of downstream plus algorithmic latency; in the AudioMixer case it's this snippet of code:

	// Now that we're connected, we can determine our downstream latency.
	media_node_id id;
	FindLatencyFor(dest, &fDownstreamLatency, &id);
	TRACE("AudioMixer: Downstream Latency is %Ld usecs\n", fDownstreamLatency);
 
	// SetDuration of one buffer
	SetBufferDuration(buffer_duration(format.u.raw_audio));
	TRACE("AudioMixer: buffer duration is %Ld usecs\n", BufferDuration());
 
	// Our internal latency is at least the length of a full output buffer
	// plus mixing time, plus jitter
	fInternalLatency = BufferDuration()
		+ max(kMinMixingTime, bigtime_t(0.5 * BufferDuration())) + kMaxJitter;
	TRACE("AudioMixer: Internal latency is %Ld usecs\n", fInternalLatency);
 
	SetEventLatency(fDownstreamLatency + fInternalLatency);

As you can read in the code, the calculation is the sum of:

media_destination_latency + buffer_duration + algorithmic_latency (note that the scheduling latency is ignored).

My feeling is that Haiku's values might be more realistic than Jack's, especially if the destination latency equals the audio driver latency, but at the same time they are not very comparable with the Jack ones.

For Jack there's a tool called jack_iodelay which, by connecting the soundcard input to the output, allows measuring real hardware latencies. On the internet the rumor is that the values calculated by QJackCtl are inconsistent compared with the empirical measurements done by jack_iodelay. I think we need a similar tool under Haiku to say with some certainty what the Linux/Jack hardware delay is, and what the Haiku/media_server one is.

Additionally, I can say another thing: realtime threads have nothing to do with the latency reported by QJackCtl, since it's a prediction; whether the kernel is realtime or not is not taken into consideration.

In the second part of the article I will repeat those measurements with jack_iodelay... and give a more detailed description of the differences.

Re: Jack2 : A Personal Analysis (Part #1)

I then find the presentation of the comparison table in the article misleading, since it puts side by side numbers that are potentially unrelated... Other than that, I'm looking forward to the second part of the article :)

Re: Jack2 : A Personal Analysis (Part #1)

Sorry about that : ), I forgot to make it clearer that the numbers were just grabbed from the GUI, but a comparison is still interesting because those are numbers both systems rely on for their functioning. Anyway, thanks for pointing this out to me!

Re: Jack2 : A Personal Analysis (Part #1)

Thank you very much for your investigation. I find the compared latency results in particular very interesting; they indeed make the choice between porting JACK and a Media Kit JACK skin a less blind one.

Re: Jack2 : A Personal Analysis (Part #1)

phoudoin wrote:

Thank you very much for your investigation. I find the compared latency results in particular very interesting; they indeed make the choice between porting JACK and a Media Kit JACK skin a less blind one.

The Haiku media kit numbers are interesting; right away we can see they're far too low to even be output-only latency at 48kHz with the stated buffer sizes. So what are they? I think, judging from the data I have, that they're actually 192kHz numbers. The relentless "bigger numbers are better" marketing has meant that consumer audio chips tend to offer 96kHz or even 192kHz modes, even though these are pointless at best and actively degrade output sound (due to a thing called ultrasonic intermodulation) at worst. The experimental patch Barrett used prefers 192kHz for whatever reason. With the same buffer size, a higher sample frequency means lower latency, and that would explain these (otherwise factually impossible) numbers.

So the Haiku media kit number at the top (1024 frames at 192kHz) is somewhat comparable to the JACK2 number at the bottom (256 frames at 48kHz) because they're both dealing with a buffer that corresponds to 5.3ms of audio; the slightly higher figure in Haiku is a consequence of the weird calculations done in the media kit and doesn't necessarily indicate anything is worse (or better) about Haiku.

That's interesting because if the Haiku media kit out of the box on consumer hardware runs with buffers of 256 frames at 192kHz reliably (no drop-outs, glitches, or weird media-kit-specific buffer bloat problems), then it would presumably also run 64 frames at 48kHz, a far more useful set of parameters that is difficult to achieve out of the box with a Mac or a PC laptop running a Linux distribution. That would be worthy of more study.

Re: Jack2 : A Personal Analysis (Part #1)

I'm willing to bet my every possession that NoHaikuForMe is Paul Davis, Jack's author. I can spot him a country mile off!

Thanks for your input P!

:)

Re: Jack2 : A Personal Analysis (Part #1)

danboid wrote:

I'm willing to bet my every possession that NoHaikuForMe is Paul Davis, Jack's author. I can spot him a country mile off!

Since I am well compensated and have everything I need, I will let you out of this rather awkward wager. I suggest that you buy new spectacles though: a barnyard is not a goose, a goose is not a combine, and I am not Paul Davis.