My question is not an audio question as such, but it is highly I2S related. I have been reading this thread plus various RPi Linux sources back and forth, but a couple of questions remain unanswered.
I am using the I2S interface to capture bit stream data from a measurement instrument. The instrument needs (or supplies) a fixed frequency (2..12 MHz) and outputs a single bit data stream at that frequency. I need to capture that data stream for further processing. So, in audio terms, I have a sound card which only has a stereo input channel (currently using 2 x 32 bits).
I have managed to patch together something that works quite reliably. I followed the AdaFruit guide at https://learn.adafruit.com/adafruit-i2s ... g-and-test. That worked, but unfortunately the infrastructure it sets up does not allow flexible sample rates.
To achieve flexible sample rates, I took asoc-i2s-loader.c and renamed the codec and codec_dai. Then I took dmic.c and stripped it heavily; the most important changes were switching the available rates to SNDRV_PCM_RATE_CONTINUOUS and removing the multi-channel support. After a little trial and error, without really understanding what is going on, almost everything works with these two kernel modules loaded.
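For reference, the rate change boils down to the capture stream description in the codec DAI driver. A minimal sketch of the relevant fragment (the name and the rate limits are illustrative placeholders, not my actual module):

```c
/* Fragment of a stripped-down codec DAI based on the dmic.c skeleton.
 * "bitstream-hifi" and the rate limits are made-up examples. */
static struct snd_soc_dai_driver bitstream_dai = {
	.name = "bitstream-hifi",
	.capture = {
		.stream_name  = "Capture",
		.channels_min = 2,
		.channels_max = 2,
		/* any rate within [rate_min, rate_max], not a fixed list */
		.rates        = SNDRV_PCM_RATE_CONTINUOUS,
		.rate_min     = 8000,
		.rate_max     = 384000,
		.formats      = SNDRV_PCM_FMTBIT_S32_LE,
	},
};
```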
I am not very proud of what I have done. I have a lingering feeling that this could have been accomplished with some clever DT overlays, without fiddling with my own patched kernel modules. I would very much like a solution that leans on standard components, as I really do not want to repeat these steps for every kernel version change. Is there such a way?
The AdaFruit instructions are a bit dated, and a lot seems to have happened since.
Another very strange thing is that I have problems setting certain sample rates. For any sample rate between approximately 150 000 and 151 500 samples/s (9 600 000 to 9 700 000 bits/s at 2 × 32 bits per frame), I always get a bit clock of exactly 9.6 MHz.
All other bit rates are fine, but that small gap is very odd. I traced the issue with a frequency counter, and /sys/kernel/debug/clk/clk_summary agrees with me. The ALSA layer happily reports, e.g., 150 780 samples/s (9 649 920 bits/s) from snd_pcm_hw_params_set_rate_near(), but the Linux clock framework shows a PCM clock of exactly 9 600 000 Hz.
I tried to follow where the I2S clock is actually set, but got lost somewhere in linux/drivers/clk/bcm/clk-bcm2835.c. It might also be that the clock-setting error happens earlier in the ALSA - I2S - Linux clk (clk_set_rate) chain. I would also be interested in seeing which MASH parameters the clock divisor ends up with in this case, but I cannot find where they are set. (I am not so much worried about the jitter of the 500 MHz source clock, but the divisor jitter is of high interest.)
Any advice on either of my problems?