
ARCHIVED FORUM -- March 2012 to February 2022
READ ONLY FORUM

This is the second Archived Forum, which was active between 1st March 2012 and 23rd February 2022.

Avant 55 UHD 8bit or 10bit???

This post has 13 Replies | 2 Followers

Alex
Not Ranked
Essex
Posts 64
OFFLINE
Bronze Member
Alex Posted: Mon, May 1 2017 8:54 AM

Hi Guys

Over the past few days I've seen a few threads saying that the Avant can only handle up to 8-bit UHD.

But on my Sky Q I have set the UHD output to 10-bit with no problem at all via HDMI 1???

Should I keep it as it is, or change it to 8-bit?

Aussie Michael
Top 25 Contributor
Melbourne, AU
Posts 3,730
OFFLINE
Bronze Member

If it ain't broke...

:-)

Kiran
Top 100 Contributor
UK
Posts 1,023
OFFLINE
Gold Member
Kiran replied on Mon, May 1 2017 1:55 PM
Your HDMI cable should be 2.0-rated (18 Gbps) to handle the higher 10-bit signal.

And then your TV needs to be capable of displaying the higher 10-bit colour depth.

Like Michael says, if it ain't broke, don't worry.

8-bit, 10-bit and so on need to be looked into when purchasing a matrix and sending the signal over Cat 👍🏻

I am doing this currently with an Atlona matrix and Cat6A capable of 40 Gbps 🔥

Regards

Kiran

Born in NL; I ride ML
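
Kiran's 18 Gbps figure lines up with the arithmetic. As a rough back-of-the-envelope sketch (assuming standard CTA-861 4K60 timing of 4400 x 2250 total pixels including blanking, and HDMI's 8b/10b TMDS overhead - details not given in the thread), 10-bit 4:4:4 at 4K60 overshoots HDMI 2.0, while the 4:2:0 subsampling that Sky Q most likely uses fits comfortably:

```python
# Back-of-the-envelope HDMI TMDS bandwidth for 4K60.
# Assumptions (not from the thread): CTA-861 timing of 4400 x 2250 total
# pixels including blanking (594 MHz pixel clock) and 8b/10b TMDS coding.

PIXEL_CLOCK_HZ = 4400 * 2250 * 60   # 594 MHz for 2160p60
TMDS_OVERHEAD = 10 / 8              # 8b/10b line coding

def tmds_gbps(bits_per_component: int, components_per_pixel: float) -> float:
    """Aggregate TMDS data rate in Gbps for a bit depth and chroma format."""
    return (PIXEL_CLOCK_HZ * bits_per_component * components_per_pixel
            * TMDS_OVERHEAD / 1e9)

# 4:4:4 carries 3 full samples per pixel; 4:2:0 averages 1.5.
print(f"4K60  8-bit 4:4:4: {tmds_gbps(8, 3):.2f} Gbps")    # ~17.82, fits 18 Gbps
print(f"4K60 10-bit 4:4:4: {tmds_gbps(10, 3):.2f} Gbps")   # ~22.28, exceeds HDMI 2.0
print(f"4K60 10-bit 4:2:0: {tmds_gbps(10, 1.5):.2f} Gbps") # ~11.14, fits easily
```

That would explain how a 10-bit Sky Q signal passes over an 18 Gbps link in the first place.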

Alex
Not Ranked
Essex
Posts 64
OFFLINE
Bronze Member
Alex replied on Tue, May 2 2017 12:40 AM

Yeah, my sources go through an Atlona matrix over Cat7 and it seems to support 10-bit fine.

Best purchase ever!

Back to the Sky....... you're right, I'm not touching it!

Kiran
Top 100 Contributor
UK
Posts 1,023
OFFLINE
Gold Member
Kiran replied on Tue, May 2 2017 5:32 AM
You got balls installing Cat7 👍🏻

That cable is too thick for my house...

Regards

Kiran

Born in NL; I ride ML

Alex
Not Ranked
Essex
Posts 64
OFFLINE
Bronze Member

My B&O dealer sold me some; I didn't think it was that bad in terms of thickness.

Kiran
Top 100 Contributor
UK
Posts 1,023
OFFLINE
Gold Member
Kiran replied on Tue, May 2 2017 9:24 AM

I don't know if you have seen my thread on my house being renovated. 

In the end, I will have 120 of those cables coming back to my server room lol. 

I was going to use Nexans orange Cat7. 

Regards

Kiran

Born in NL; I ride ML

Aussie Michael
Top 25 Contributor
Melbourne, AU
Posts 3,730
OFFLINE
Bronze Member
Kiran:

Your HDMI cable should be 2.0-rated (18 Gbps) to handle the higher 10-bit signal.

So Kiran, I'd need a 2.0 cable for 10-bit?
ssbrig
Not Ranked
Posts 37
OFFLINE
Bronze Member
ssbrig replied on Tue, May 9 2017 6:29 PM

The 2014 Avant 55" model supports 8-bit color depth only - it does NOT support 10- or 12-bit color depth. See the BeoCare email confirming this below.

Does my 2014 Avant 55 inch support 10 bit or 12 bit color depth?

Response By Email (Ayrton Campos) (02/05/2017 04.47 PM)
BeoVision Avant is 8-bit.

Regards,
J.R.
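
Incidentally, what a display claims to support is advertised in its EDID: the HDMI vendor-specific data block carries "deep colour" flags (DC_30bit = 10-bit per component, DC_36bit = 12-bit). A minimal Python sketch for reading those flags from a raw EDID dump - assuming a single CEA-861 extension block at offset 128, with a function name of our own choosing - might look like this; an 8-bit-only panel would leave all three flags clear:

```python
def hdmi_deep_color_support(edid: bytes) -> dict:
    """Read the deep-colour flags from the HDMI vendor-specific data block.

    Minimal sketch (hypothetical helper): assumes the raw EDID carries one
    CEA-861 extension block at offset 128, which is typical but not universal.
    """
    ext = edid[128:256]
    if len(ext) < 4 or ext[0] != 0x02:   # 0x02 = CEA-861 extension tag
        return {}
    dtd_offset = ext[2]                  # data blocks end where the DTDs begin
    i = 4
    while i < dtd_offset:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        # Tag 3 = vendor-specific block; OUI 00-0C-03 (LSB first) = HDMI Licensing
        if tag == 3 and length >= 6 and ext[i + 1:i + 4] == bytes((0x03, 0x0C, 0x00)):
            flags = ext[i + 6]
            return {"10-bit (DC_30bit)": bool(flags & 0x10),
                    "12-bit (DC_36bit)": bool(flags & 0x20),
                    "16-bit (DC_48bit)": bool(flags & 0x40)}
        i += 1 + length
    return {}
```

On Linux the raw EDID is usually readable at /sys/class/drm/*/edid, so the whole dump can be loaded with `open(path, "rb").read()` and passed straight in.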
Aussie Michael
Top 25 Contributor
Melbourne, AU
Posts 3,730
OFFLINE
Bronze Member
ssbrig:

The 2014 Avant 55" model supports 8-bit color depth only - it does NOT support 10- or 12-bit color depth.

Thanks.
Kiran
Top 100 Contributor
UK
Posts 1,023
OFFLINE
Gold Member
Kiran replied on Mon, May 22 2017 8:13 AM
Aussie Michael:

So Kiran, I'd need a 2.0 cable for 10-bit?

Sorry Michael, late reply. Yes, you would need an HDMI 2.0 cable coming from your 4K source to your 10-bit TV. I don't know what 12-bit gives you over 10-bit, but it will be more colour depth.

Should we have waited for 10-bit?

Regards

Kiran

Born in NL; I ride ML
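
As for 12-bit versus 10-bit, the difference is simply the number of shades each colour channel can take; the totals fall straight out of the arithmetic. A quick sketch:

```python
# Shades per channel and total colours at each bit depth (3 channels).
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>5,} shades/channel, {levels ** 3:>14,} colours")

# Output:
#  8-bit:   256 shades/channel,     16,777,216 colours
# 10-bit: 1,024 shades/channel,  1,073,741,824 colours
# 12-bit: 4,096 shades/channel, 68,719,476,736 colours
```

The extra shades mostly matter for smooth gradients in HDR material; an 8-bit panel has to truncate or dither anything deeper before display.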

Millemissen
Top 10 Contributor
Flensborg, Denmark
Posts 14,680
OFFLINE
Gold Member

...and we need support for HDCP 2.2 (copy protection) if we want to watch 4K and co. in the near future.

The story goes on and on and on and on....

MM

There is a tv - and there is a BV

w5bno123
Top 150 Contributor
London
Posts 578
OFFLINE
Bronze Member
w5bno123 replied on Mon, May 22 2017 9:58 AM
Alex:

My B&O dealer sold me some; I didn't think it was that bad in terms of thickness.

There are two types of Cat7 that B&O recommend: one is quite thick and meant for in-wall installation, and the thinner one is meant as a product cable from the wall plate to the product.

The problem with the thinner cable is that it is stranded, and we have seen issues with 4K signals - you get sparkles on the screen. The thick cable has solid copper cores and is more resilient; however, it's unworkable as the bend radius is massive.

I would use Cat6 - in fact, good old-fashioned Cat5e is fine!
Kiran
Top 100 Contributor
UK
Posts 1,023
OFFLINE
Gold Member
Kiran replied on Mon, May 22 2017 12:56 PM
Alex:

But on my Sky Q I have set the UHD output to 10-bit with no problem at all via HDMI 1???

Now that I fully understand your setup, I suppose your Sky box goes into your Atlona matrix?

Then your matrix pushes the signal out via your Cat7 cable to your receiver?

Have you checked the display output settings in your matrix? If I am right, this should currently be set to 8-bit...

Regards

Kiran

Born in NL; I ride ML
