Following on from another thread I’m curious as to how people choose to connect devices to their routers to stream catch-up TV services, Netflix etc.
Do you use ethernet or wireless and why?
Particularly if your router also forms part of a high-quality audio streaming setup, it would seem prudent to connect video devices wirelessly, to keep any electrical noise they generate off your wired LAN.
What do people think?
Is a wireless connection for video inferior to ethernet?
I use WiFi to both my TV for Netflix/Now TV etc. and to my 222 for music streaming - all works very well. I’ve read through the other threads and I find this whole subject very confusing. I’ve read threads on here and elsewhere and (assuming you have good, reliable WiFi) I can’t see any consensus that WiFi sounds worse than Ethernet. I’ve tried both and can’t tell the difference - I read recently that the CEO of Auralic is suggesting WiFi is preferable if the WiFi module in the streamer has been designed ‘right’.
So, with all of that, I guess I can only understand the expenditure on fancy switches and all that where there is less than reliable WiFi - but it seems to me it would be better just to spend and get reliable WiFi.
As I said the whole subject confuses me but WiFi works well for me - both TV and music.
I’ve heard that before as well, though I couldn’t quote sources. Just as the implementation is critical to DAC quality I’d be surprised if it isn’t the same for network connectivity.
I have no problems streaming 4k content (multi gigabyte movies via Plex) over WiFi to Apple TV boxes. Beyond streaming stability I’m not sure how one would expect differences between WiFi and wired networking to manifest themselves.
I think in the early days of WiFi and streaming it was a lot less reliable - connections were flaky etc. Perhaps this is what has led to the widely held belief that ethernet is better quality?
I also can’t tell any difference between WiFi and ethernet when streaming video from the internet.
When we moved to our last house, a standard WiFi router wouldn’t cover the whole house. I opted for a Ubiquiti setup that worked brilliantly.
In our current house, I needed WiFi to reach the generator installed in the back garden, so I upgraded to a WiFi6 long range access point and the UDM Pro router. It didn’t cost me much more than a lot of mesh routers do, with the added benefit that pre-owned Ubiquiti gear fetches decent money.
Can you tell the difference when streaming music?
I don’t stream any music over my network. I have a Melco library which is connected to a Chord Qutest DAC via USB.
The Melco is connected to the router via an EE8 switch - only to allow for remote control and album information download when ripping. Also connected are a Sony Blu ray player and Apple TV box. TV is connected via Wi-Fi.
I’m now thinking that it would be best to connect the Blu ray player and Apple box via Wi-Fi so that only the Melco is connected to my switch.
I occasionally got freezes when using the TV via WiFi, even though the Netgear WiFi hub was 2 meters away. I moved to ethernet mostly because it was easy to do, and now no stutters. However it was infrequent, so would have been happy with WiFi.
It might depend on whether other people in the house are streaming at the same time, in which case go ethernet if it’s easy.
Provided both setups are done properly, either will work fine for streaming video.
Assuming you connect your Ethernet cable directly to your router, or you have a distribution switch closer to the TV anyway, it will:
- Provide rock-solid connections, independent of other network traffic in or around your home. (Assuming your internet uplink to the provider isn’t congested; but if that pipe is big enough, video should make it through, with all the buffering and so on.)
- Consume a little more energy than a direct client-to-router connection. (Assuming you have a simple setup and/or already run a mesh network or repeaters anyway for other reasons.)
- Need cables to be run. (Usually a one-time investment.)
A WiFi connection, given decently modern equipment (WiFi 4 or later, ideally WiFi 5+ on the 5 GHz band):
- Will in itself be totally sufficient.
- Unless you hit one of WiFi’s possible glitches:
  - Insufficient coverage from the router: distance (large houses), wall materials, poor-quality equipment or antennas.
  - A congested radio environment, e.g. lots of household members streaming at the same time, or many neighbours (in a city centre I usually scan 30+ networks; I don’t want to know what this looks like in a tower block with hundreds of apartments).
  - Some other disturbance: the 2.4 GHz range is shared with a lot of other radio standards.
  - Peculiar situations, like weather-radar avoidance, usually on 5 GHz. (I’m affected by that: every few hours my router tries to re-optimise, and my Apple TV loses its connection for a few seconds. Usually this has no effect, due to buffering, but the “we lost connection” popup shows briefly.)
- Most of these situations are highly site-dependent and can be worked around: better or more modern equipment (router, clients, repeaters, mesh networks), tuning your radio setup (which radio bands to use, and for which devices), moving large aquariums out of the signal path, and so on. The effort will differ.
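To put a number on the 2.4 GHz congestion point: channel centres there are only 5 MHz apart, while a classic 802.11b transmission occupies roughly 22 MHz of spectrum, so neighbouring channels bleed into each other. A small sketch (the 5 MHz spacing and 22 MHz width are the standard 802.11b figures; the greedy selection is just for illustration):

```python
# Why only a few 2.4 GHz WiFi channels coexist cleanly:
# channel centres sit 5 MHz apart, but a classic 802.11b signal
# is about 22 MHz wide, so nearby channels overlap in spectrum.

CHANNEL_SPACING_MHZ = 5
SIGNAL_WIDTH_MHZ = 22


def channels_interfere(a: int, b: int) -> bool:
    """True if 2.4 GHz channels a and b overlap in spectrum."""
    return abs(a - b) * CHANNEL_SPACING_MHZ < SIGNAL_WIDTH_MHZ


# Greedily pick a set of mutually non-interfering channels from 1..13:
clear = []
for ch in range(1, 14):
    if all(not channels_interfere(ch, used) for used in clear):
        clear.append(ch)

print(channels_interfere(1, 4))  # True: only 15 MHz apart, they overlap
print(channels_interfere(1, 6))  # False: 25 MHz apart, in the clear
print(clear)                     # [1, 6, 11]: the familiar channel plan
```

This is why, with 30+ visible networks, most of them are inevitably sitting on top of each other: there are only three genuinely independent 2.4 GHz channels to share.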
The trick with WiFi is that you have to see whether it works, and then adjust as needed. Of course there are very clear-cut situations as well - a single detached house with good equipment and no interference should always work.
A cable will nearly always work out of the box, hence the tip to use it when it’s easy to do so.
Disclaimer: this is the IT view, as visible to the streamer/decoder software. Disturbance comes when the connection is too slow or interrupted - it’s usually binary: it works or it doesn’t.
The audiophile view, with microphonic effects, power supply variances, and other possible electrical effects, is ignored here. (I’m assuming those won’t be noticeable on any video stream, with the encoding software and all the decoding software, image enhancement, etc. making such effects negligible. We’re not talking lossless high-res audio on multi-10k equipment.)
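The buffering point above is easy to put numbers on. A hypothetical sketch (the 25 Mbit/s bitrate, 30 s buffer, 5 s dropout and 100 Mbit/s link are illustrative figures, not from any particular service or player):

```python
# Sketch: how a playback buffer absorbs a short WiFi dropout.
# All figures are illustrative assumptions, not real service parameters.

STREAM_MBPS = 25.0      # bitrate the decoder consumes (a plausible 4K rate)
BUFFER_SECONDS = 30.0   # seconds of video the player keeps buffered
DROPOUT_SECONDS = 5.0   # seconds with zero network throughput


def buffer_left(buffer_s: float, dropout_s: float) -> float:
    """Playback drains the buffer in real time while the network is down."""
    return buffer_s - dropout_s


def refill_time(deficit_s: float, stream_mbps: float, link_mbps: float) -> float:
    """Seconds to rebuild the buffer once the link is back: the link must
    deliver the missed video while playback keeps consuming in real time."""
    return deficit_s * stream_mbps / (link_mbps - stream_mbps)


remaining = buffer_left(BUFFER_SECONDS, DROPOUT_SECONDS)
print(f"{remaining:.0f} s of video still buffered")          # 25 s: no stutter

refill = refill_time(DROPOUT_SECONDS, STREAM_MBPS, 100.0)
print(f"buffer rebuilt {refill:.2f} s after the link returns")
```

So a brief radar-avoidance channel switch only produces the popup, not a freeze: playback would only stall if the outage outlasted the whole buffer.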
Interesting - but surely this is all deconstructing it unnecessarily?
My view is just empirical - if it works with wi-fi then it works. If there are issues then clearly it doesn’t and one needs to revert to ethernet.
I use both, but rely on Ethernet as my WiFi hub (Linksys) is not reliable. I find that Apple TV is better quality than the in-TV system.
We stream catch-up TV, Netflix etc. via apps on our Sony Blu ray player and Apple TV box rather than the apps on the TV. I haven’t bothered to verify it, but as both of these are known to be high-quality sources my feeling is that they are likely to out-perform the TV.
I stream netflix via ethernet. Offspring do online gaming or streaming films at the same time, and the only time there have been problems have been internet limitations. (I wired all rooms with Cat 6 to a switch beside the router when we first moved in, so it just made sense.)
Other than occasional sampling of new music, when quality isn’t important, I don’t stream music online. (Nor do I stream across the network; I only play direct from store/renderer to DAC via USB.)
I adopt the mantra that if it’s possible to hard wire then you should hard wire. It keeps the increasingly congested wireless frequencies free for devices like phones and tablets that are wireless only.
My TV set-top box is wired via BJC ethernet cable to my router.
For audio, a BJC ethernet cable connects from my router to a Cisco 2960 - another BJC ethernet cable connects to my EE8, and from there a Chord (free) Ethernet cable connects to my ND5XS2.
No problem to stream 4k materials with Wi-Fi at home, especially now that I have installed a mesh network…
Wireless 4K streaming to my Sony Bravia; it used to be on ethernet, but after trying it on wireless it works just as well.
Although I’m ethernet at the moment I’ve streamed wirelessly in 4K and I can’t tell any difference in either picture or sound. So I’m reverting to wireless. Leaves my switch free for just my Melco.
Same here. I have a stable gigabit fiber internet connection and an Eero mesh router system. I have some wired TVs and one wireless (all the same Samsung model, but different sizes) and see no difference between wired and wireless connections. I’m sure it’s all about someone’s individual setup, though!
Indeed, these days a wireless distribution network is preferable to using Ethernet everywhere. Modern WiFi standards such as WiFi 5 and WiFi 6 are a far cry from the older protocols of yesteryear, with a single WiFi access point on the broadband router. Back then WiFi was very much second best to Ethernet, but things have changed now in terms of efficiency and optimum sharing of radio bandwidth.
The optimum setup is a distribution of access points, ideally with more than one of them being a base-station AP (Ethernet connected). The more APs you have, the lower the power needed, the less contention and interference, and the greater the throughput for more devices on your WLAN. Use Ethernet for APs, for remotely powering appliances like cameras, or where you need to pass your network through thick walls and other infrastructure; other than that, a modern WiFi standard and APs are nowadays the best choice for clients in the home.
Ethernet in many ways should be avoided, or great care applied, where there are EMC and emission aspects to be mindful of - sensitive home audio springs to mind. It’s ironic that some audiophiles seem to suggest it; I think much of that has been based on legacy technology or poor WiFi implementations in some older hi-fi equipment.