
ARCHIVED FORUM -- March 2012 to February 2022
READ ONLY FORUM

This is the second Archived Forum which was active between 1st March 2012 and 23rd February 2022

 

Putting my audiophile hat back on...

This post has 17 Replies | 2 Followers

John
Top 500 Contributor
Australia
Posts 321
OFFLINE
Bronze Member
John Posted: Tue, Jul 28 2015 10:07 AM

Of the many lessons I learnt in my 'audiophile' years, the main one was to trust science, logic, and reason, and to distance myself from audiophile pseudoscience and made-up rationalisations for 'differences in the sound' of a HiFi.

I can flatly state that, nearly three years into my B&O ownership experience (V1-40 + BL9s), I have never been more satisfied or content with my system - my audiophile angst seems to have been banished for good... :-)

As such, I am reluctant to guess about some sonic changes I've noticed recently, and more interested in the solid science (if there is any) to define the reasons for the differences I've been hearing.

To wit: up until the last couple of weeks, for music I have had most of my CD collection ripped to ALAC in iTunes and streamed over my distributed WiFi network to my Apple TV3, connected via HDMI to my V1-40.  (Movies are handled by a Blu-ray player connected directly to the V1 via HDMI.)

However, despite experiencing only the odd, very rare dropout when streaming music via WiFi, I decided to address some latency issues when streaming movies by changing parts of the system from WiFi to Ethernet using Cat 6 cable.

As the TV and Blu-ray player were already connected to the network by Cat 6 Ethernet, I simply purchased and added Cat 6 Ethernet between computer and network (a Billion modem/switch/VoIP/router providing ADSL2+ service, routing and VoIP, in turn connected by Ethernet to a bridged Apple Time Capsule acting as the WiFi base station), and from the network to the Apple TV3, effectively 'hard wiring' it.

Note that hardwiring the ATV3 turns its WiFi off, and I also turned off WiFi on the computer, it too now being hardwired to the network.

I did not expect that there would be any sonic differences in sound, as from what I've read, digital is digital, zeros and ones are zeros and ones... and yet....LOL...

There is a subtle but noticeable improvement in sound quality when playing CDs.  It's subtle, and I doubt I would pick it blind using unknown/unfamiliar music, but with my favourite discs I can definitely hear improvements, which is a great surprise, as frankly I expected none.

I'm playing back my favourite discs at the same noted volume level, but perhaps the differences could be caused by placebo effect, differences in volume, expectation bias, etc.

Unfortunately, a couple of weeks later, I'm still hearing the same improvements so I'm curious as to WHY.

I'm aware of the effects of jitter in terms of digital 'noise', but have no measurements to hand as to whether an ATV3 has less jitter with an Ethernet rather than a WiFi connection.

About the only thing I can think of is that the ATV3, with WiFi turned on, would by necessity be putting out a degree of RFI, which, being near the analog line-level speaker outs from the V1 to the BL9s, could, in theory at least, add some noise to the analog signal.

Beyond that, I'm guessing... and I'm curious, but I don't want to be an audiophile who 'guesses'.

There are many here with much greater technical expertise than me when it comes to digital, and I'm wondering if anyone might proffer some ideas as to why hardwiring rather than WiFi improves the sound of zeros and ones?

Many thanks all

Kind regards

John.. Smile


Millemissen
Top 10 Contributor
Flensborg, Denmark
Posts 14,680
OFFLINE
Gold Member

I guess this is not about being an 'audiophile' or not.

It is just about doing things the right way.

It has always been the advice from the guys in Struer to hardwire as much as you possibly can.

Digital is digital and zeroes and ones are zeroes and ones, yes! But the important thing is whether those 'zeroes and ones' arrive at their destination just as they were sent from the source!

It is well known that a wireless connection is far more sensitive to unwanted influences than a cabled one.

Then again - you also need a well-shielded cable with proper connectors to do things right when you hardwire.

Some people may 'hear' these 'unwanted influences', others not - or they just don't care.

Sure, these things can be measured - and have been measured.

The tech experts may answer that.

I am just the guy who wants to listen to music as it is on the source - and I have chosen the secure path of hardwiring my devices as far as possible.

 

P.S. Wouldn't it be more correct to say:

Hardwiring does not 'improve' the sound over wifi - wifi 'deproves' the sound over hardwiring.

You can't improve 'zeroes and ones' by any means, but you surely can 'deprove' things if you don't treat the 'zeroes and ones' properly.

 

MM

There is a tv - and there is a BV

elephant
Top 10 Contributor
AU
Posts 8,219
OFFLINE
Founder
elephant replied on Tue, Jul 28 2015 10:40 AM
It could be that something (Mac/PC versus ATV) is changing bitrates, thinking that one path has more capacity than the other?

I recommend dropping Philippe Robin (PHILONDON) a PM ... He knows far more under the cover technical details.

BeoNut since '75

elephant
Top 10 Contributor
AU
Posts 8,219
OFFLINE
Founder
elephant replied on Tue, Jul 28 2015 10:50 AM
MM, bits are bits. To detect signal corruption, systems implement checks like parity bits (weak) or CRCs (stronger). If the signal fails the check, you should get a dropout. On rare occasions a corrupted block's checksum might match a valid one* - in those cases you would get an anomalous sound or a scrambled signal.
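The difference between a weak parity check and a CRC can be sketched in a few lines of Python - this is purely illustrative, not anything the ATV3 or the network stack actually runs:

```python
import zlib

# A block of 'audio' bytes and its CRC-32, computed at the sender.
payload = bytes(range(32))
crc_sent = zlib.crc32(payload)

# Flip one bit in transit: the receiver's CRC no longer matches,
# so the corruption is detected and the block can be dropped or resent.
corrupted = bytearray(payload)
corrupted[7] ^= 0x01
assert zlib.crc32(bytes(corrupted)) != crc_sent

# A single parity bit is much weaker: flip TWO bits and the overall
# parity is unchanged, so this error slips straight through the check.
def parity(data: bytes) -> int:
    p = 0
    for b in data:
        p ^= bin(b).count("1") & 1
    return p

two_flips = bytearray(payload)
two_flips[3] ^= 0x01
two_flips[9] ^= 0x01
assert parity(bytes(two_flips)) == parity(payload)
```

This is why undetected corruption is rare but not impossible: any fixed-size check maps many different payloads to the same value, so a sufficiently unlucky corruption can still pass.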

Let's see what Philippe says.

* Sounds strange/impossible, but as a 23-year-old programmer I was castigated for writing bad code and causing a big mainframe test to fail. I knew I did not make mistakes (Devil), so I ran a check on the data and proved that the test data was "wrong" ... the data that had been randomly chosen to simulate a failure actually matched the check perfectly! So I was vindicated. Let's have a Party !!!

BeoNut since '75

John
Top 500 Contributor
Australia
Posts 321
OFFLINE
Bronze Member
John replied on Tue, Jul 28 2015 1:34 PM

Many thanks for the kind responses.  Whilst I don't know exactly why it sounds better hardwired, I'm quite sure that I'm not deceiving myself either, so it will be staying hardwired, and mainly for the sonic benefits.

Of course, this has also set off a bit of an upgrade bug, and I intend to update my 4+ year old Billion router, as it is getting a bit old - it's no longer supported with firmware updates, and has an annoying tendency to drop the ADSL line at odd times and then reboot itself - time for a new VoIP router, methinks.

With respect to cable shielding, it's currently Belkin Cat 6 being utilised; my local B&O dealer uses all Cat 7, which he can supply, and I believe the main difference between Cat 6 and Cat 7 is the shielding.

So I will upgrade the cables as well in the short term.

Thanks for the recommendation of someone to PM about the technical reasons - I really am quite curious as to the 'why' of it sounding better hardwired vs WiFi.

Thanks again Millemissen and Elephant for the most kind responses.

Kind regards

 

John... Cool

 

 

Geoff Martin
Top 150 Contributor
Struer, Denmark
Posts 672
OFFLINE
Bronze Member

elephant:
It could be that someone (Mac/PC versus ATV) is changing bitrates thinking that one path has more capacity than the other

My guess would be something in this ballpark as well. I wouldn't run down a rabbit hole filled with excruciating minutiae (such as jitter or the like) before making absolutely sure that your devices aren't using a different

  • transcoding
  • CODEC
  • Sampling rate
  • bit depth
  • something else 
as a result of changing from WiFi to Wired. It could be something really simple that toggles on and off as a result of the different signal path.

Cheers

-geoff

 

Millemissen
Top 10 Contributor
Flensborg, Denmark
Posts 14,680
OFFLINE
Gold Member

Obviously

MM

There is a tv - and there is a BV

John
Top 500 Contributor
Australia
Posts 321
OFFLINE
Bronze Member
John replied on Tue, Jul 28 2015 2:26 PM

Geoff Martin:

elephant:
It could be that someone (Mac/PC versus ATV) is changing bitrates thinking that one path has more capacity than the other

My guess would be something in this ballpark as well. I wouldn't run down a rabbit hole filled with excruciating minutiae (such as jitter or the like) before making absolutely sure that your devices aren't using a different

 

  • transcoding
  • CODEC
  • Sampling rate
  • bit depth
  • something else 

as a result of changing from WiFi to Wired. It could be something really simple that toggles on and off as a result of the different signal path.

 

Cheers

-geoff

 

 

Hi Geoff and many thanks for your thoughts

I played back at the same volume; the files are CD rips into iTunes using ALAC; the computer is an iMac (Yosemite); WiFi is via an Apple Time Capsule. The only thing that really changed is that plugging an Ethernet cable into the Apple TV3 turns its WiFi receiver/transmitter off; I also turned WiFi off on the iMac via its network settings, as it was no longer required once I had installed an additional Cat 6 cable from the iMac to the network - so the signal path is effectively hardwired all the way from computer to Apple TV3.

So the codec would be the same - and as it's the same file, I would assume(?) the same sampling rate, bit depth, etc.
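One half of that assumption is at least easy to verify: that the file on disk itself is unchanged. A quick sketch (the function and file path here are illustrative, not any B&O or Apple tooling) - if the same rip hashes identically before and after the network change, the stored codec, sampling rate and bit depth are provably unchanged, and any audible difference must arise downstream, in transit or playback:

```python
import hashlib

def file_sha256(path: str) -> str:
    """Hash a file in chunks so large ALAC rips need not fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (hypothetical path):
#   file_sha256("/Music/iTunes/Favourite Track.m4a")
# Run once before and once after the change; equal digests mean
# the source bits are identical - transcoding, if any, happens in transit.
```

What this cannot check, of course, is whether the sender transcodes the stream on the way to the ATV3 - which is exactly Geoff's point.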

I agree about the rabbit hole and the excruciating minutiae that one can get caught up in - I've been there a few times in years past, with 'Alice in Wonderland' audiophile experiences on my prior Naim Audio setup, and the 'rituals', tweaking, and faffing about that went with it - never again!...

Either way, hardwiring sounds better and represents a subtle upgrade sound-wise for virtually no cost - I'm even more delighted than ever with my B&O kit, and only kicking myself that I didn't try hardwiring it earlier... oh well, live and learn... Surprise

BTW Geoff, loving your articles on your blog - please keep them coming... Smile

Kind regards

John... Cool

 


Geoff Martin
Top 150 Contributor
Struer, Denmark
Posts 672
OFFLINE
Bronze Member

John:

So the codec would be the same - and as it's the same file I would assume(?) the same sampling rate, bit depth etc.

This is not necessarily a valid assumption. Although the CODEC of your original file sitting on your hard drive has not changed, it could be that the transmitter is transcoding into a different CODEC (or sampling rate converting it, or bit depth reducing it, or so on...) before sending it across the network. It could just be something like an error concealment (instead of error correction) in the system, for example...

I'm not trying to be coy here - the truth is that I don't know what Apple is doing with their system in this regard. I just wouldn't be surprised to find out that they're (or anyone else is) bit-reducing over WiFi using some sneaky transcoding, and this would be the root of your signal degradation. This would fall under a "Quality of Service over Quality of Audio" decision made by someone at some company...

Your pragmatism is smart - wired sounds better so use wires... Leave the "why" to someone who likes rabbit holes.

Thanks for your kind words regarding the blog. I just posted a new one today, and I'm already working on next week's instalment.

Cheers

-g

Geoff Martin
Top 150 Contributor
Struer, Denmark
Posts 672
OFFLINE
Bronze Member

Millemissen:

Hardwiring does not 'improve' the sound over wifi - wifi 'deproves' the sound over hardwiring.

 

"deproves" - I like that word. I think I'll use it...

Reminds me a little of a friend who complains that most people don't listen to high-res audio, they listen to vile-res audio...

Cheers

-g 

Millemissen
Top 10 Contributor
Flensborg, Denmark
Posts 14,680
OFFLINE
Gold Member

@Geoff

As a not-native-English-speaking person I had to look that word up before I used it.

Actually you can use it! 

Anything else, that I found, did not go well with 'improve':

'worsen, downgrade, loosen, break, fracture, destroy, ruin, spoil, undo, shatter, smash, deteriorate, decay, disintegrate, degenerate, fizzle, fade, rot, go bad, go to seed, decline, malfunction'.

Maybe 'disimprove' could have been used instead - but I find 'deprove' better ;-)
As you see - I learn a lot writing/reading on Beoworld ;-)))
MM

There is a tv - and there is a BV

Millemissen
Top 10 Contributor
Flensborg, Denmark
Posts 14,680
OFFLINE
Gold Member

Geoff Martin:

I just posted a new one today, and I'm already working on next week's instalment.

There we go:

http://www.tonmeister.ca/wordpress/2015/07/28/bo-tech-shark-fins-and-the-birth-of-beam-width-control/

MM

There is a tv - and there is a BV

Chris Townsend
Top 50 Contributor
Qatar
Posts 3,531
OFFLINE
Bronze Member
With reference to ripping to iTunes, I read this the other day, and if I'm not wrong it doesn't really paint a very good picture of the quality. From post 6 onwards.

I read it as I'm trying to convince myself to get a Beosound 9000 Cool, and I want to send the best quality I can get to the 9's.

http://www.head-fi.org/t/658650/music-ripped-from-cd-with-itunes-vs-music-purchased-on-itunes-quality

Beosound Stage, Beovision 8-40, Beolit 20, Beosound Explore.

elephant
Top 10 Contributor
AU
Posts 8,219
OFFLINE
Founder
elephant replied on Tue, Jul 28 2015 5:25 PM
Millemissen:

As a not-native-english-speaking-person I had to look that word up, before I used it.

Actually you can use it! 

Anything else, that I found, did not go well with 'improve':

'worsen, downgrade, loosen, break, fracture, destroy, ruin, spoil, undo, shatter, smash, deteriorate, decay, disintegrate, degenerate, fizzle, fade, rot, go bad, go to seed, decline, malfunction'.

And now I am being taught English language subtleties Smile

Now I just have to find the right occasion for use

BeoNut since '75

Millemissen
Top 10 Contributor
Flensborg, Denmark
Posts 14,680
OFFLINE
Gold Member

Chris Townsend:
With reference to ripping to iTunes, I read this the other day, and if I'm not wrong it doesn't really paint a very good picture of the quality. From post 6 onwards.

I read it as I'm trying to convince myself to get a Beosound 9000 Cool, and I want to send the best quality I can get to the 9's.

http://www.head-fi.org/t/658650/music-ripped-from-cd-with-itunes-vs-music-purchased-on-itunes-quality

Why not just rip to FLAC or ALAC? You don't need iTunes for that - there are better rippers out there.

Especially if you want the lossless quality that can compare with a CD you would pop into 'your' 9000'er.

Using lossy compression surely will make a difference - different encoders may even produce different results.

But a lot of people won't notice the difference between a FLAC/ALAC file and a file carefully ripped to 320 kbps with a modern encoder.
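The gap being discussed here can be put in rough numbers - a small illustrative sketch (the 50-70% lossless figure is a typical observation, not a guaranteed ratio):

```python
# CD audio: 44.1 kHz sampling, 16 bits per sample, 2 channels.
cd_bps = 44_100 * 16 * 2     # raw PCM rate in bits per second
mp3_bps = 320_000            # a '320 kbps' lossy file

print(cd_bps)                # 1411200
print(cd_bps / mp3_bps)      # 4.41 - the PCM stream carries ~4.4x the data

# A lossless codec such as ALAC or FLAC typically shrinks the PCM to
# roughly 50-70% of cd_bps while decoding back bit-identically; a lossy
# encoder instead discards (ideally inaudible) information to hit 320k.
```

So the 320 kbps file is carrying well under a quarter of the raw data - whether that discarded three-quarters is audible is exactly the encoder-quality question MM raises.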

 

In general I would say that (for most of the music produced today) the important thing is not whether we listen to an uncompressed file or a 256/320 kbps file (compressed with a modern encoder).

The important thing is how the original files were mastered (prepared for the different release formats).

It is easier to notice heavy mastering, compared to a more 'gentle' mastering, than to hear the difference between a lossless file and a lossy one (at 256/320).

Unfortunately we have very little influence on the mastering - we have to buy (as a download or a CD) or stream (from a music service) what the record companies offer.

 

Interesting stuff - but we might have gotten too 'off-topic' with this.

 

MM

There is a tv - and there is a BV

Andrew
Top 100 Contributor
Frinton, UK
Posts 917
OFFLINE
Bronze Member
Andrew replied on Wed, Jul 29 2015 9:09 AM

I think hardwiring definitely results in improvements, depending on what you are listening through - and also on not overloading devices. For example, to me anyway, streaming music to an Airport Express and then on to a Cambridge DAC sounds better than via the ATV and the DAC, even though the ATV is meant to be better. Both are hardwired, but the AE does not drop out and is dedicated to just streaming, so maybe that's why. Playing straight from the Mac into the DAC improves things again - or maybe it's just in my head, and I feel better knowing that things are wired up the old-fashioned way as far as possible. I use an old iPad rather than my iPhone for streaming music, but do not have anything else running on it, so that it can just concentrate on that. I suppose other variables will be the speed of your WiFi, signal strength, etc. - and of course different recordings all sound different anyway.

Geoff Martin
Top 150 Contributor
Struer, Denmark
Posts 672
OFFLINE
Bronze Member

Chris Townsend:
http://www.head-fi.org/t/658650/music-ripped-from-cd-with-itunes-vs-music-purchased-on-itunes-quality

One has to take these kinds of comparisons with a grain of salt. There is no guarantee that the original master for the CD that one is ripping is identical to the original master that was submitted to Apple for sale on iTunes. So, a 1:1 comparison of a ripped (and possibly converted) CD to the "same track" purchased on iTunes (or somewhere else, or streamed, or on vinyl, or whatever) is probably not a 1:1 comparison. It might be as different as comparing two different photographs of the same mountain, for example...

In fact, Apple has been accepting / requesting / recommending original masters submitted to iTunes as 96 kHz files for years. This means that, minimally, the same master was possibly either downsampled to 44.1 kHz for the CD or upsampled to something higher for uploading to iTunes.
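That 96 kHz to 44.1 kHz step is itself non-trivial, which is one reason two releases of the "same" master can differ - the conversion ratio is rational but awkward, as a quick sketch shows (illustrative arithmetic only, not any particular resampler's implementation):

```python
from fractions import Fraction

# 96 kHz -> 44.1 kHz: a polyphase resampler must conceptually
# upsample by 147 and decimate by 320.
ratio = Fraction(44_100, 96_000)
print(ratio)                       # 147/320

# By contrast, 88.2 kHz -> 44.1 kHz is a clean halving, and
# 48 kHz -> 44.1 kHz is another awkward ratio.
print(Fraction(44_100, 88_200))    # 1/2
print(Fraction(44_100, 48_000))    # 147/160
```

The quality of the anti-aliasing filtering applied during that resampling is one of the places where a mastering chain can quietly leave its fingerprint on the result.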

Whether or not the mastering (i.e. EQ or dynamic range compression) would be different for the two releases is dependent on how lazy/busy the mastering engineer was.

Cheers

-g
