Melco. What hypothesis am I testing if I demo it?

Hi @badger1, I have no way of measuring changes to either phase or common mode noise.

The designer, John Swenson, has provided a couple of white papers on the subject.
However, I can’t include PDF documents on this forum, so I have put them into a Dropbox link - Dropbox


Thanks, should also be able to find them online.

Interesting reads, they take me right back to my degree days!


Read it. I am still rather confused how the phase noise can possibly transfer from the network switch to the renderer/streamer in a well-implemented UPnP-style architecture. In Linn’s design, the renderer (my Linn DS) is in charge of pulling data from the server. It buffers a significant chunk of the music data using asynchronous techniques, so everything is under the control of the streamer clock timing-wise. Surely it’s only jitter or phase noise introduced by the renderer that is at issue; the network clock is not a significant factor. I must be missing something obvious, or perhaps a well-understood secondary effect? I am sure USB-style designs might be different from Ethernet-based UPnP architectures. Would be interesting to get a perspective from @Simon-in-Suffolk

Also, the common mode noise is of course still in play, but the discussion about buffering not being helpful seems to refer mostly to common mode noise.

Because it’s the timing variation in the flow of symbols that are received by the end device. The end device recovers the clock from the changing symbols - what is often referred to as the serialisation clock - so the device knows what time a new symbol appears on the wires or fibre, even if it hasn’t changed. This recovered clocking drives analogue and digital circuitry in the network stack, and this clock will create a degree of EM noise. If this clock timing is moving subtly, that is, being phase-modulated, then a noise profile will be created in response to the symbol timings.
In an ideal world such EM is decoupled from other circuitry, power lines, ground planes etc. In the real world there is always a little EM leakage.
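
To make that concrete, here is a minimal Python/NumPy sketch of the mechanism described above. The sample rate, clock frequency, wander rate and modulation depth are all illustrative values, not measurements from any real network PHY:

```python
# Minimal sketch: a recovered clock with a tiny phase wander, and the
# sideband energy that wander creates in the clock's spectrum.
import numpy as np

fs = 1_000_000      # simulation sample rate, Hz (illustrative)
f_clock = 125_000   # recovered serialisation clock, Hz (illustrative)
f_mod = 1_000       # rate of the symbol-timing wander, Hz (illustrative)
pm_depth = 0.01     # peak phase deviation, radians (a very small wander)

t = np.arange(fs) / fs  # one second of samples
# Phase-modulated clock: the timing of its edges moves subtly at f_mod.
clock = np.sin(2 * np.pi * f_clock * t
               + pm_depth * np.sin(2 * np.pi * f_mod * t))

spectrum = np.abs(np.fft.rfft(clock)) / len(clock)
freqs = np.fft.rfftfreq(len(clock), 1 / fs)

# The wander shows up as sidebands at f_clock +/- f_mod: new spectral
# content that EM coupling can carry into nearby circuitry.
for f in (f_clock - f_mod, f_clock, f_clock + f_mod):
    idx = np.argmin(np.abs(freqs - f))
    print(f"{f:>8d} Hz : {20 * np.log10(spectrum[idx]):6.1f} dB")
```

The carrier stays dominant, but the timing wander appears as sidebands at f_clock ± f_mod: spectral content that would not exist with a perfectly steady clock.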


Thanks Simon. Tried to reply yesterday but the site seemed to be down, for me anyway.

So if I understand you correctly, the clock activity, and especially its phase noise, generates common mode noise indirectly through the network components, e.g. the power supply, network cable etc. The EM noise transmits to the streamer via the Ethernet cable, like other sources of common mode noise. And different clocks and switches are likely to produce a different shape of noise depending on lots of factors?

Not common mode noise in the serial data stream, but the recovered clock will be modulated to some extent by the phase modulation in the data or symbol stream. Digital clocks and digital switching generate a noise profile from their EM emissions.
But yes, the noise power, the frequency distribution of the noise, and the effects of such noise will depend on many factors.


Got it - EMI from clock recovery in the streamer… And presumably the act of buffering in the streamer should, assuming it is large enough, mostly mitigate the transfer of jitter artefacts in the incoming data stream into the stream fed to the DAC by the streamer’s CPU and clock. The streamer’s clock then determines the timing for feeding data into the DAC.

Well, not really, in the sense that this transport jitter or phase noise has no direct relationship with the DAC/DSP clock and the jitter from that, so the size of any buffer is irrelevant in that respect. But you are right about the EMI.
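
For illustration, a minimal Python sketch (hypothetical numbers) of the timing point both posts agree on: once the buffer is prefilled, the readout is paced by the local clock, so arrival jitter never shows up in the data timing. It deliberately says nothing about the separate EMI path:

```python
# Compare the regularity of jittery network arrivals with a readout
# paced purely by the local clock. All intervals are made-up values.
import random

random.seed(42)
NOMINAL = 0.010                       # nominal packet interval: 10 ms

# Network side: arrivals wander by up to +/- 4 ms around nominal.
arrival, arrivals = 0.0, []
for _ in range(1000):
    arrival += NOMINAL + random.uniform(-0.004, 0.004)
    arrivals.append(arrival)

# Streamer side: fixed cadence after a 0.5 s buffer prefill.
reads = [0.5 + i * NOMINAL for i in range(len(arrivals))]

def interval_spread(ts):
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    return max(gaps) - min(gaps)

print(f"arrival interval spread: {interval_spread(arrivals) * 1000:.2f} ms")
print(f"readout interval spread: {interval_spread(reads) * 1000:.6f} ms")
```

The first figure is several milliseconds; the second is essentially zero, because the readout never consults the arrival times at all.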


The noise is hard for a poor old software engineer to grasp, but I am getting there :rofl::rofl::rofl::rofl:
Thanks again!

So you are saying the inbound network packet timing patterns can impact the output sound? And that it can depend on how faithfully the streamer reconstructs the data stream based on those timing patterns?

Not here, but historically I know inter-frame timing differences did affect the first-generation Naim streamers.
No, here I am referring to inter-symbol timing… a level more elementary than digital data… and to the fact that the analogue symbols are sent asynchronously using a serialisation clock.


Yes, in the sense that the clock in the Ethernet card can emit electromagnetic interference that influences other components in the streamer, like the power supply, which in turn can influence the sound. That’s my understanding. How large this effect is, of course, is the question.

This is all independent of the asynchronous data transfer that is going on to get the data from the media server into the hands of the CPU in the streamer, so that it can feed that data into the DAC synchronously with the help of its own clock. That generates jitter and phase noise of its own but, as Simon says, is totally independent of the network-level noise.
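
As a rough sketch of that split (Python threads; the names, chunk counts and periods are hypothetical, not any actual streamer’s implementation), the asynchronous fetch side and the clock-paced DAC side share only a buffer:

```python
# Rough sketch of the split described above: an asynchronous fetch loop
# fills a buffer from the server, while a separate loop drains it to
# the DAC, paced only by the local clock.
import queue
import threading
import time

audio_buffer = queue.Queue(maxsize=64)   # the decoupling buffer

def network_fetch(chunks=500):
    # Asynchronous side: delivers whenever the network does.
    for n in range(chunks):
        time.sleep(0.001)                # stand-in for a UPnP/HTTP read
        audio_buffer.put(f"chunk-{n}")   # blocks if the buffer is full

def dac_feed(chunks=500, period=0.001):
    # Synchronous side: paced purely by the streamer's own clock.
    next_tick = time.monotonic()
    for _ in range(chunks):
        chunk = audio_buffer.get()       # blocks if the buffer runs dry
        next_tick += period
        time.sleep(max(0.0, next_tick - time.monotonic()))
        # ... hand `chunk` to the DAC here, on the local clock edge ...

threading.Thread(target=network_fetch, daemon=True).start()
dac_feed()
print("output pacing never referenced network arrival times")
```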

Not sure that I get this? :slight_smile: This is new to me - “inter-symbol timing”

Low-level network stuff :grin:

How low can that be? :slight_smile:


Well below my normal understanding level!


It’s much easier to live in a world where the code just does what you tell it to do!!!


Yes, on networks down at the physical level, digital data is not sent; instead, analogue modulated signals are sent - baud, not bits - which are combinations of analogue amplitude, phase, or frequency modulation of voltages or light. Particular analogue constructs of frequency, amplitude and phase are called symbols, and bandwidth is determined by the symbol rate. These, when decoded in sequence, can expand into a larger series of binary values, i.e. binary data.
So when some ill-informed people say it’s just ‘1’s and ‘0’s that are sent over network links, they couldn’t be more wrong.
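
As a toy illustration of symbols versus bits, here is a generic 16-QAM mapping in Python. It is a made-up example, not the line code of any particular Ethernet standard, but it shows how one amplitude/phase symbol can carry four bits, so the symbol (baud) rate sits below the bit rate:

```python
# Map 4-bit groups onto a 16-QAM constellation: each symbol is one
# amplitude/phase combination, represented here as a complex number.
levels = [-3, -1, 1, 3]                  # amplitude levels per axis

bits_to_symbol = {}
for value in range(16):
    i = levels[(value >> 2) & 0b11]      # in-phase (I) component
    q = levels[value & 0b11]             # quadrature (Q) component
    bits_to_symbol[value] = complex(i, q)

data = 0b1011_0010_1110_0001             # 16 bits of payload
symbols = [bits_to_symbol[(data >> s) & 0xF] for s in (12, 8, 4, 0)]

# 16 bits collapse into 4 analogue symbols on the wire; the receiver
# recovers its serialisation clock from the timing of these symbols.
print("16 bits ->", len(symbols), "symbols:", symbols)
```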


I see your point, so you are saying that the incoming baud rate/timings can impact variables such as signal-to-noise ratio, bandwidth, and some modulation techniques, which in turn can impact the streamer/DAC?