A TV 4 times sharper than HD

Apr 26, 2012 By Troy Wolverton

Now that you've got a high-definition TV, you may want to start saving up for a super-high-definition one.

Television manufacturers, ever eager to shore up their business, are gearing up to roll out sets with what's known as 4K screen resolution. These TVs, which should start to hit store shelves in the United States later this year, have about four times the resolution of 1080p screens, the current standard for high-definition sets.

Regardless of the size of its screen, a 1080p TV has about 2 million pixels arrayed across 1,920 vertical columns and 1,080 horizontal rows. Although manufacturers haven't yet settled on a standard, 4K sets generally have at least 7 million pixels - and sometimes a lot more - arranged across about 4,000 columns and 2,000 rows. All those extra pixels allow 4K televisions to display images in much finer detail than HDTVs.
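The arithmetic is easy to check. Here is a quick sketch in Python using the 3840 x 2160 grid found in many prototype 4K panels (the exact grid is an assumption, since, as noted, the standard hadn't settled):

```python
# Compare total pixel counts for 1080p and a typical 4K panel.
def pixel_count(columns, rows):
    """Total pixels in a columns x rows grid."""
    return columns * rows

hd = pixel_count(1920, 1080)    # 1080p
uhd = pixel_count(3840, 2160)   # a common "quad HD" 4K grid

print(hd)         # 2073600 -- about 2 million
print(uhd)        # 8294400 -- about 8 million
print(uhd / hd)   # 4.0 -- four times the pixels of 1080p
```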

I got to see some prototype 4K TVs in January and more recently at the offices of Marseille, a Santa Clara, Calif.-based chip company that has designed a 4K video processor. On bigger screens at close viewing distances, the difference between 1080p and 4K is stunning.

At a close viewing range, video on a big screen can look pixelated, and colors and images can blur into the background. By contrast, 4K video looks super sharp, almost lifelike.

Marseille showed me one demo comparing the two resolutions side by side. It reminded me of a similar demo I saw several years ago that compared the high-definition video on Blu-ray discs to the standard-definition video on regular DVDs. The difference was so great I immediately wanted the higher-resolution screen and video.

But most consumers will likely want to hold off on 4K for now. The first 4K TVs are likely to be outrageously expensive. Toshiba's 55-inch 4K television, for example, is already available in Japan for a cool $10,000 or so.

Marseille says its 4K video processor would only add about $10 to the cost of a TV or Blu-ray player. But that's not the big cost issue for manufacturers. Instead, it's the display technology, said Paul Gagnon, an analyst at NPD DisplaySearch, a market research firm.

The ability to cram that many pixels into a relatively small space is on the cutting edge of display manufacturers' capabilities, he said. Because they are still perfecting the manufacturing of such displays, they're likely to churn out a lot of defective and unsellable displays, which will ramp up the cost of the ones that can be sold.

"That problem that we see is that it's just super-expensive," said Gagnon, adding that in the near future, "We don't show that dropping that much."

Indeed, DisplaySearch projects that manufacturers will only sell about 5,000 4K TVs this year worldwide and won't sell more than a million per year until 2015.

The other thing that will likely give consumers pause is the paucity of 4K content. Consumers have a hard enough time finding 3-D content to view on newer stereoscopic TVs; in 4K, they will have difficulty finding anything at all.

That situation isn't likely to change anytime soon, Gagnon said. Manufacturers of Blu-ray discs aren't focused on 4K and neither are broadcast networks, he said. And the bandwidth required to stream 4K video would be enormous.

Even if those problems can be solved, some critics say consumers don't need or want 4K. If you watch TV on a smaller screen or sit across the room from it, you likely won't notice the difference between 4K and HD, they say. That matches my experience: the farther I stood from Marseille's demo, the less noticeable the differences between the 4K and HD images became.

But the first HD sets faced similar hurdles. They were ridiculously pricey, and there was little HD content. And consumers unfamiliar with HD video often had a difficult time distinguishing between it and standard resolution video. Eventually, the industry solved those problems, and 4K could as well.

Marseille executives note that consumers already have plenty of 4K content in the form of digital pictures. Most cameras sold these days shoot at a resolution of 8 megapixels or better, which is right in line with what 4K televisions can display. And while it may be many years before your favorite shows are available in 4K, chips such as the one designed by Marseille can upscale HD so that it approximates 4K resolutions.

And consumer demand for higher resolution screens could grow. The average screen sold has gotten ever larger over the years. Consumers can now find and buy 70- and even 80-inch sets at relatively reasonable prices.

Meanwhile, Apple's latest iPhones and its new iPad have shown the benefits of displays with lots of densely packed pixels.

"As Apple demonstrated with the new iPad, pixel density is something people are interested in," said Gagnon, adding that it could "plant the seed for that 4K market coming down the line."

More information: Troy Wolverton is a technology columnist for the San Jose Mercury News.


User comments: 48

Noumenon
2.1 / 5 (7) Apr 26, 2012
I'm waiting for the super-duper HD
db4060
4.5 / 5 (2) Apr 26, 2012
Where's that volumetric holographic display we've been waiting for? How else am I going to plan my attack run on the death star? lol
kaasinees
1 / 5 (1) Apr 26, 2012
Quad-HD is old news though...
axemaster
5 / 5 (2) Apr 26, 2012
I've never quite understood, why are high resolution TV screens so expensive compared to the much sharper computer screens?
kaasinees
3 / 5 (6) Apr 26, 2012
I've never quite understood, why are high resolution TV screens so expensive compared to the much sharper computer screens?

Now compare the price to a monitor that is the size of a TV.
I just want a big-ass monitor, I got my own hardware, my own computer, to put on the monitor. But it's cheaper to buy a TV..
Eikka
5 / 5 (2) Apr 26, 2012
Over normal viewing distances, there's not all that great a difference between 720p and 1080p either until you go to display sizes way beyond 55". It's a limitation of the human vision, not of the television.

Computer monitors were heading towards 4K resolutions on the desktop before the whole "HD" craze. That actually made the technology take a step backwards because it was cheaper to make the panels in the same 16:9 1080p format instead of the 4:3 or 16:10 high res formats that are better suited for content production and consumption on the computer.

That's why e.g. the iPad screen is 4:3 - because it's a good match for a magazine or a book page in its proportions.
Deathclock
3 / 5 (6) Apr 26, 2012
I've been running 1080p on my computer monitor and TV for the better part of a decade now, 1920x1080 is all you need for display sizes up to about 32" if you are sitting close like a monitor, or up to about 60" if you are sitting further back like a TV. Unless you plan on getting a bigger display than those you don't need anything more than 1080p. Hell, my 720p projector looks as good as any TV at 110" when sitting back ~10 feet from it.
Deathclock
3 / 5 (6) Apr 26, 2012
I've never quite understood, why are high resolution TV screens so expensive compared to the much sharper computer screens?


Computer screens are only sharper at the same resolution if they are physically smaller. Resolution is half the equation, the other half is the size of the display. A 1080p TV at 50" will look as clear as a 720p monitor at 25" because the pixel density (pixels per inch) is equal (roughly) in those cases.

Pixel density is what is important, and I have never understood why that isn't what people focus on, how did it end up that people only care about raw number of pixels when that really doesn't tell you the quality of the picture?
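Pixel density is simple to compute from a display's resolution and diagonal size. A sketch in Python (the sizes are just the ones from the comment above):

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch measured along the screen diagonal."""
    diagonal_pixels = math.hypot(h_pixels, v_pixels)
    return diagonal_pixels / diagonal_inches

print(round(ppi(1920, 1080, 50), 1))   # 44.1 -- 1080p at 50 inches
print(round(ppi(1280, 720, 25), 1))    # 58.7 -- 720p at 25 inches
```

Running the numbers like this is also a quick way to sanity-check density claims.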
rwinners
not rated yet Apr 26, 2012
No one has mentioned content. Where is it to come from?
dirk_bruere
5 / 5 (1) Apr 26, 2012
Ah, content...
Same old crap repackaged yet again for a new generation of suckers
Moebius
1.8 / 5 (5) Apr 26, 2012
This article is wrong about resolution like most articles. You need 8.3 million pixels just to double the resolution of 1080p.

1080 is actually the vertical. The resolution is 1920x1080. Just to double the resolution you need to double both those numbers and then multiply. Approximately 8.3 million.

The new TV is almost twice as sharp as HD, but not even close to 4x which would be about 33 million pixels.
rwinners
not rated yet Apr 26, 2012
Back to content. If one is not capturing (roughly) 8000 x 4000 dpi, then any attempt to display at that ratio involves interpolation and decreases the actual clarity of an image.
Deathclock
3.2 / 5 (10) Apr 26, 2012
This article is wrong about resolution like most articles. You need 8.3 million pixels just to double the resolution of 1080p.

1080 is actually the vertical. The resolution is 1920x1080. Just to double the resolution you need to double both those numbers and then multiply. Approximately 8.3 million.

The new TV is almost twice as sharp as HD, but not even close to 4x which would be about 33 million pixels.


They didn't say 4x, they said 4k, 4k is the name of the display format and it refers to 4k pixels wide... it's been around for a long time.

http://en.wikiped...solution
Deathclock
2.7 / 5 (7) Apr 26, 2012
Back to content. If one is not capturing (roughly) 8000 x 4000 dpi, then any attempt to display at that ratio involves interpolation and decreases the actual clarity of an image.


4k is roughly 4000x2000... I don't know where you got 8000x4000 from? Most movies are shot at 4k natively (some even at 8k), so there is your answer about content.

http://en.wikiped..._4K).svg
rwinners
not rated yet Apr 26, 2012
Actually, I'm pretty sure that pictures are still shot on film and then scanned at whatever resolution is required to do the job. Still, higher resolution means higher bytes/bigger files, at least if it is going to be used at the intended screen size.
Vendicar_Decarian
4 / 5 (4) Apr 27, 2012
I told you that 1080p was crap.
alfie_null
5 / 5 (1) Apr 27, 2012
Looking forward to this if it means we get higher density computer displays. I want a display that looks like a printed page. And at a decent price.
digitaltrails
not rated yet Apr 27, 2012
I think there was a study that found that if a movie was any good, viewers wouldn't notice the difference between DVD and Blu-Ray. Megapixel marketing hype transferred from cameras to TV's?
Birger
5 / 5 (2) Apr 27, 2012
The one good app I can see for this is high-resolution screens for telemedicine. Put a high-res camera in an operating theater and let an expert on the other side of the world "sit in" during surgery, without missing any detail.
Green_Dragon
not rated yet Apr 27, 2012
Computer monitors were heading towards 4K resolutions on the desktop before the whole "HD" craze.

Yea when everyone else was like zomg 720p!!! I was thinking I had 768p on win 98.
antialias_physorg
5 / 5 (4) Apr 27, 2012
I just recently got a 720p TV. Shit took me 4 years to save up for...


Where's that volumetric holographic display we've been waiting for?


You know, I'd be excited about such technology if there was anything worth watching on TV. But since there isn't...Meh.
Eikka
5 / 5 (1) Apr 27, 2012
any attempt to display at that ratio involves interpolation and decreases the actual clarity of an image.


A 1080p image isn't any less clear when displayed at 4K. It is exactly as clear as it is.

The difference is that when you upscale the image to a higher resolution, it gets a slightly blurry look because you're not looking at the aliasing errors produced by the pixel grid anymore.

It's similar to how the high frequency hiss of a vinyl record makes music sound "crisp" and clear.

If you have two pixels side by side, with values 1 and 0, their information content isn't the sharp transition that happens half way in the middle - that's the artifact. It should actually be a smooth gradient between the two. Then there's also the fact that when the video is recorded, it is actually blurred down to 0.7-0.9 the physical resolution of the format to reduce moire patterns and jagged edges, so a single pixel of real information in the image is already spread over multiple pixels.
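That two-pixel example can be sketched in one dimension: a toy illustration (not any real TV's scaler) of nearest-neighbour versus linear-interpolation upscaling, showing the hard edge turning into a gradient:

```python
def upscale_nearest(samples, factor):
    """Nearest-neighbour upscale: repeats each sample, keeping hard edges."""
    return [s for s in samples for _ in range(factor)]

def upscale_linear(samples, factor):
    """Linear-interpolation upscale: blends between neighbouring samples."""
    out = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        for step in range(factor):
            t = step / factor
            out.append(a + (b - a) * t)
    out.append(samples[-1])
    return out

edge = [1.0, 0.0]                 # the two pixels from the comment above
print(upscale_nearest(edge, 4))   # [1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
print(upscale_linear(edge, 4))    # [1.0, 0.75, 0.5, 0.25, 0.0]
```

The first output keeps the sharp transition (the aliasing artifact); the second replaces it with the smooth gradient described above.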
antialias_physorg
5 / 5 (4) Apr 27, 2012
If you have two pixels side by side, with values 1 and 0, their information content isn't the sharp transition that happens half way in the middle - that's the artifact

Not always. Sometimes it's really the sharp contrast that is intended. Especially in medical images aliasing (and whether or not it should be used) is a hot topic.
You can get misinformation either way.

A false sharp contrast due to resolution limitations could mean the difference between the diagnosis of a lesion and healthy, attached tissue - while a blur due to aliasing could mean the difference between healthy tissue and a (potentially false) malignant cancer diagnosis.
Deathclock
2 / 5 (4) Apr 27, 2012
Looking forward to this if it means we get higher density computer displays. I want a display that looks like a printed page. And at a decent price.


DPI of most printers is far less than most monitors, what happens though is that the ink bleeds together to form smooth transitions between the "pixels".
Moebius
5 / 5 (2) Apr 27, 2012
This article is wrong about resolution like most articles. You need 8.3 million pixels just to double the resolution of 1080p.

1080 is actually the vertical. The resolution is 1920x1080. Just to double the resolution you need to double both those numbers and then multiply. Approximately 8.3 million.

The new TV is almost twice as sharp as HD, but not even close to 4x which would be about 33 million pixels.


They didn't say 4x, they said 4k, 4k is the name of the display format and it refers to 4k pixels wide... it's been around for a long time.

http://en.wikiped...solution


They said in the title 4 times sharper, that's 4x and wrong, it isn't even twice as sharp.
Eikka
3 / 5 (1) Apr 27, 2012
Not always. Sometimes it's really the sharp contrast that is intended. Especially in medical images aliasing (and whether or not it should be used) is a hot topic.
You can get misinformation either way.


As per Nyquist-Shannon, you cannot record a frequency component greater than half your sampling rate.

That literally means that the maximum amount of information shown by a computer screen for an arbitrary image is half the amount than there are physical pixels on the monitor. Attempting any more is just adding noise that gives the illusion of detail, or it may be a lucky coincidence where a contrast edge just happens to align with the pixel grid - but you can't know that unless you already know it is like that.

I don't see the value of visual trickery when trying to show what exactly is and is not recorded in a picture. You shouldn't even see the individual pixels, because they provide you with no reliable information about the picture.
antialias_physorg
5 / 5 (2) Apr 27, 2012
I don't see the value of visual trickery when trying to show what exactly is and is not recorded in a picture.

Pattern recognition (by humans) does not rely on just a pixel and its neighbors. It relies on such fuzzy concepts as 'shape' and 'texture' and relational positioning - and these concepts become extremely fuzzy in the case of difficult diagnostic situations where the human eye/brain starts putting things together from barely visible structures/edges with parts missing, being obscured, hidden by noise or just broken.

Sometimes upping the contrast helps. Sometimes using an aliasing helps. There's really no hard and fast rule for what is better in any given diagnostic situation. This is ongoing research (which I was actively involved in).

We're not just dealing with the image - but more importantly with what WE (or an algorithm) can see in the image (a 'psychovisual' component, if you will).

The image isn't the important thing. The correct diagnosis is.
Eikka
5 / 5 (1) Apr 27, 2012
Of course, there are certain tricks you can do to improve the resolution at some other cost. A medical X-ray display device can be manufactured with no color filters to make use of the fact that there are three sub-pixels for every one pixel. The monitor then will obviously be just black and white.

You can do the same thing on a normal computer monitor with special software, but it reduces the accuracy of the colors. In fact, it is routinely done to render text, so the low resolution of modern day computer monitors wouldn't be so painfully apparent.
Eikka
not rated yet Apr 27, 2012
Sometimes upping the contrast helps. Sometimes using an aliasing helps.


What exactly do you mean when you say "aliasing"?

antialias_physorg
5 / 5 (3) Apr 27, 2012
To give you an example: An algorithm I developed for the classification of osteoarthritis in the knee uses two images reconstructed from the same CT dataset. One reconstructed with a 'soft' kernel (a 'smoothed' image) and one reconstructed with a 'hard' kernel (a 'high contrast' image). The soft one allows for shape analysis and structure recognition of a barely visible structure inside the bones while the hard reconstruction allows for texture analysis of the trabeculae.

Hard reconstruction renders more detail - but it also emphasizes noise (and the above mentioned structure all but disappears in the noise). Soft reconstruction dampens noise at the expense of fine detail. It's really a tradeoff - and the hybrid approach led to success.

These are pictures where diagnose from 4 doctors would give you 6 diagnoses (just kidding...but a single doctor would likely as not diagnose two different severities given the SAME images over a 6 month interval for these patients)
Eikka
not rated yet Apr 27, 2012
One reconstructed with a 'soft' kernel (a 'smoothed' image) and one reconstructed with a 'hard' kernel (a 'high contrast' image).


Then we're talking about aliasing in an entirely different part of the display chain. You're talking about constructing the image out of a dataset, and how it should be done. I'm talking about what happens when you actually display the image on the physical device.

But, your problem is similar to the problem in video production. You have a source video of higher resolution that you're supposed to downsample into 720p or 1080p and you have to decide how much anti-aliasing to apply. To get no added error, you should be blurring it down all the way to 0.7 times the target resolution because that's close to the Nyquist limit, but often as much as 0.9 gives a sharper appearance without you noticing that there's something wrong. The actual information content with the 0.9 image will be lower because of the added noise though.
antialias_physorg
4.5 / 5 (2) Apr 28, 2012
But, your problem is similiar to the problem in video production. You have a source video of higher resolution that you're supposed to downsample

Not really - even if the source produces images at the resolution of the screen it's sometimes advantageous not to go with that resolution (for the reasons mentioned, but also because of very subtle artefacts if the resolutions don't quiiiiite match)

Noise doesn't go away, though, when you do a blurring. You cannot add information to an image by blurring (and the more you blur, the less information is contained in the image). So a 0.7 image will contain less information than a 0.9 image. But information isn't the deciding factor in medicine (or in watching a DVD). It's (human/algorithmically) USEFUL information that is the deciding factor - and that can differ between applications (and humans).
Because here you are playing two factors off against each other: information in the image vs. distraction from noise
daniel_ikslawok
not rated yet Apr 28, 2012
This is a 145 inch TV with 8000x4000 pixels, if I got it right:
http://www.youtub...oademail
bluehigh
1 / 5 (1) Apr 28, 2012
As per Nyquist-Shannon, you cannot record a frequency component greater than half your sampling rate.
- Eikka

I tend to agree. Not sure if it's confidential or what, but we had a bloke (some super math boffin, they wheel these fruitcakes in from time to time) explain to us recently exactly how we can acquire data faster than the Nyquist limitation. I objected, with some support from colleagues, and got told to relax and step back.

Crazy, but the madman's method seems to work after some serious tests. I like Pineapples.

bluehigh
1 / 5 (1) Apr 28, 2012
Its essentially a missing pulse detector that enables the resolution of frequencies at more than twice the sampling frequency. One limitation is that any rate of change in frequency lags in resolution. Kinda like a really slow phase locked loop.
bluehigh
1 / 5 (1) Apr 28, 2012
I can see how the display of harmonics might enable a clearer picture. Kinda like having a great audio source and only have some poor response speakers for output but add 4X hardware and it all gets better. As for content ... more shows about pineapples would be great.

Eikka
5 / 5 (1) Apr 28, 2012
Its essentially a missing pulse detector that enables the resolution of frequencies at more than twice the sampling frequency. One limitation is that any rate of change in frequency lags in resolution. Kinda like a really slow phase locked loop.


Then it probably doesn't deal with Nyquist-Shannon in the same sense. It's not impossible to detect frequency components higher than the Nyquist limit. It just involves some trickery, and it potentially masks some other information you might want to receive because you can't have any more actual information than what fits within the limits.

I think what your "madman" professor was talking about was simply something akin to heterodyne radio, where you're listening to the beat frequency of two signals - the difference of the sample rate and the measured signal. In video production, this would show up as a moire pattern.
Eikka
4 / 5 (1) Apr 28, 2012
Noise doesn't go away, though, when you do a blurring. You cannot add information to an image by blurring (and the more you blur the less information contained in the image. So a 0.7 image will contain less information than a 0.9 image.


You don't quite understand what I'm saying. The actual information content of the screen is (almost) exactly 0.7 times its X and Y resolutions because you need more than one pixel to represent one "point" of actual picture information. That is the actual resolution of the display, and not a "blurred down" version.

So when you anti-alias a picture - we're calling it blurring here - you're removing the frequency components that the display cannot transmit to the viewer, and which would cause artifacts to appear which would mask the real information present in the picture.

The real information content of the image drops when you add errors, and frankly I don't see how adding errors to a medical image helps make a better diagnosis.
Eikka
not rated yet Apr 28, 2012
For example. In the one dimensional case, the line OXOXOX contains 6 "pixels". To mark a single distinct point on this line, you need two transitions - a start and an end - so you can only put down three points on the line: the letters X. If you try to cram them any tighter, you don't know if OXXOOO means two separate points, or one fat point, which is the only reasonable interpretation given the information we have there.

Your real resolution is 0.5 times the physical resolution, or half the sampling rate, and what looks like a fat point is an aliasing artifact. For a 2D picture, it's half the total number of pixels, or 0.7 x 0.7 per dimension, which is 0.49 of the area.

If we were to draw a 2D image the same way, the shape that we get is actually indeterminate. It's really blurry, even though the pixel grid makes it seem like it has distinct edges. But those edges could actually be up to half the width of a pixel in any direction and they'd still fall in the same spot.
Eikka
not rated yet Apr 28, 2012
For Bluehigh, along the same lines of the example. If you see a pattern like this emerge:

OXXOOO, OXOXOO, OXXOOO, OXOXOOO...

And you assume that what you're measuring is a continuous periodic signal, then you can pretty much sum the different measurements up and take an average.

It's called oversampling, and every time you double the number of measurements, you add one more bit of information to what you got, so measuring the same thing twice is effectively the same as measuring the thing with twice the sampling rate, but half as often.

There are some limitations though. Like if you have a frequency at a multiple of your sampling rate, at the same phase, then it's going to show up exactly the same every time and you can't know what it is. The method relies on there being random noise that throws off your measurements slightly, like moving your head to see through a mosquito screen so you see the whole view behind and not only through the holes.
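That "random noise throws off your measurements" point is the idea behind dithering, and it can be sketched with a one-bit quantizer. This is a toy example of the principle, not the actual method being discussed:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def quantize(x):
    """One-bit quantizer: all we can ever record is 0 or 1."""
    return 1 if x >= 0.5 else 0

true_value = 0.3

# Without noise, every measurement is identical and tells us nothing new.
clean = [quantize(true_value) for _ in range(10000)]
print(sum(clean) / len(clean))   # 0.0 -- stuck below the threshold

# With random noise (dither), averaging many measurements recovers the
# sub-threshold value, just as the comment describes.
noisy = [quantize(true_value + random.uniform(-0.5, 0.5)) for _ in range(10000)]
print(round(sum(noisy) / len(noisy), 2))   # close to 0.3
```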
Eikka
not rated yet Apr 28, 2012
Which would btw. work well for antialias_physorg as well, if the image the doctors were looking at were to be continuously generated from a higher resolution source, with a bit of noise added to it and maybe moving it slowly, because then the aliasing errors would flicker and change all the time, and only what's real would be consistent over time.

The human brain is good at averaging stuff, so you'd see through the noise and errors more easily.
simplicio
4.2 / 5 (5) Apr 28, 2012
1080 is actually the vertical. The resolution is 1920x1080. Just to double the resolution you need to double both those numbers

The new TV is almost twice as sharp as HD, but not even close to 4x which would be about 33 million pixels


This is not right. Doubling both axes gives 4 times the resolution, as the article says. Doubling only the vertical pixels would be 2 times the res.

1920 by 1080 = 2,073,600 pixels (HD)
4000 by 2000 = 8,000,000 (4 times HD)

Also with more (smaller) pixels you get less brightness. So you must make pixels brighter which means more power to use (like new iPad, which can be hot to touch).
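The disagreement in this thread is really linear versus areal measures of "resolution"; a couple of lines make it explicit:

```python
hd_w, hd_h = 1920, 1080
uhd_w, uhd_h = hd_w * 2, hd_h * 2   # double each axis

linear_gain = uhd_w / hd_w                      # 2.0 -- twice the lines of detail
pixel_gain = (uhd_w * uhd_h) / (hd_w * hd_h)    # 4.0 -- four times the pixels

print(linear_gain, pixel_gain)
```

So "four times the resolution" (total pixels) and "twice as sharp per axis" describe the same screen.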
simplicio
5 / 5 (4) Apr 28, 2012
Actually, I'm pretty sure that pictures are still shot on film and then scanned at whatever resolution is required to do the job.

A lot still is. But movie industry is moving to digital video shooting more and more now (eg, Avatar, Sin City, Haywire, Total Recall (2012), Knowing, etc).
bluehigh
1 / 5 (1) Apr 29, 2012
It's not impossible to detect frequency components higher than the Nyquist limit.


Not my call. You will have to wait for the data.

Your understanding of harmonic modulation will change. We are using a pattern recognition algorithm. The data rate increase is low but detectable and well below half periods sampling. Magic?

.. and the moire patterns are removed with anti-aliasing.

The Avengers Marvel superhero movie is outstanding. Watch for The Hulk to punch out Thor.

Lurker2358
not rated yet Apr 29, 2012
Now that you've got a high-definition TV, you may want to start saving up for a super-high-definition one.


ha...yeah right.

The entertainment industry hasn't made the best use of what they've had lately.

Considering the advancements in cameras, televisions, and computer animation and other effects, both the movies and the television series for the past 12 years have been incredibly mediocre.

Fifty remakes and sequels that miss the damn point and fail to tell a compelling story.

Super hero movies that focus on all the wrong things, and fail to tell a compelling story, such as X-Men 3, and Iron Man 2, which was pretty much like watching an episode of Jersey Shore.

A Lord of the Rings trilogy that botched the story so badly it was almost un-watchable; only good thing they changed was leaving out Tom Bombadil.

Sure there's been a few gems, but the industry hasn't convinced me I need any such upgrades, because they didn't make good movies with the previous technology.
georgert
not rated yet Apr 29, 2012
Sweet, however, in my community I get exactly five HD channels through my basic cable service, and there's been no rush to increase those numbers.
Deathclock
1 / 5 (1) Apr 30, 2012
Sometimes upping the contrast helps. Sometimes using an aliasing helps.


What exactly do you mean when you say "aliasing"?


Aliasing is what happens when you try to represent a diagonal edge with a grid of square cells... instead of a smooth diagonal line you get a stair-step pattern.

Anti-aliasing, commonly used in video games, smoothly blends the edge into the background to reduce this effect.

http://en.wikiped...Aliasing
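A minimal illustration of that stair-step effect and how supersampling-style anti-aliasing softens it (a toy sketch, not how any particular game engine does it):

```python
def coverage(x, y, samples=1):
    """Fraction of the unit pixel cell at (x, y) lying below the diagonal
    line y = x, estimated by sampling a grid of points inside the cell."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            px = x + (i + 0.5) / samples
            py = y + (j + 0.5) / samples
            if py < px:
                hits += 1
    return hits / (samples * samples)

# One sample per pixel: each cell is all-or-nothing -- the stair-step.
print([coverage(x, 2, samples=1) for x in range(5)])
# 4x4 samples per pixel: the cell the edge crosses gets a partial shade,
# which is what smoothly blends the edge into the background.
print([coverage(x, 2, samples=4) for x in range(5)])
```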

Doc_aymz
not rated yet May 02, 2012
I can't even see HD unless I move closer to the TV. Why would it need to be sharper except for maybe IMAX sized screens?
