Talk:High-definition television/Archive 3
This is an archive of past discussions about High-definition television. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | Archive 2 | Archive 3 | Archive 4
Completely agree with Elastino
As far as aspect ratios are concerned, this page is completely wrong
You cannot list 1024×768 as "16:9".
Simple mathematics: 1024/768 = 1.333, while 16/9 = 1.778.
The same goes for 1280×1024.
These are 4:3 (or, in the case of 1280×1024, 5:4), not 16:9.
So I have fixed it.
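For anyone who wants to double-check, the arithmetic is trivial to verify (a minimal Python sketch; the resolution list is just illustrative):

 from fractions import Fraction

 def display_aspect(width, height):
     # Display aspect ratio, assuming square (1:1) pixels
     r = Fraction(width, height)
     return f"{r.numerator}:{r.denominator} ({width / height:.3f})"

 for w, h in [(1024, 768), (1280, 1024), (1280, 720), (1920, 1080)]:
     print(f"{w}x{h} -> {display_aspect(w, h)}")

 # 1024x768  -> 4:3 (1.333)
 # 1280x1024 -> 5:4 (1.250)
 # 1280x720  -> 16:9 (1.778)
 # 1920x1080 -> 16:9 (1.778)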
Very good! You are correct. Aspect ratios are very important especially to those that are learning the video trade. Very nice contribution! --WaynePacelle (talk) 08:48, 31 December 2008 (UTC)
High-Definition Display Resolutions Table
The "aspect ratio" entries on this table are wrong and non-sensical.
- Agreed. Having a video background, I knew what these figures should look like and they were all way off. In addition, the accompanying descriptions were confusing and downright misleading.
- Here is the original version: http://wiki.riteme.site/w/index.php?title=High-definition_television&oldid=187395227#High-Definition_Display_Resolutions
- Here is my revision: http://wiki.riteme.site/w/index.php?title=High-definition_television&oldid=187477170#High-Definition_Display_Resolutions
- And here is the current version (as of this writing) with Fernvale meshing the original back into the revision: http://wiki.riteme.site/w/index.php?title=High-definition_television&oldid=187477787#High-Definition_Display_Resolutions
- I spent a great deal of time revising both that table and the SD Display Resolution one so that they would actually make sense. I have referenced actual specifications where these definitions can be found and presented them clearly and concisely. Since I would rather not participate in an edit war over this, here is a summary of everything wrong with the previous version that I have attempted to correct:
- "Pixel Resolution" is misleading. The term "resolution" does not specifically apply to pixels; lines are used in television systems and while the number of pixels often match the number of television lines in computer applications, it is not universal across all television sets.
- "Video Format" is misleading, because things like "WXGA" are not video formats but merely screen resolutions.
- "HD-Ready" by itself typically only refers to the television set and technically is not a video format either as it has no specification. The terminology is used to market non-standard displays while still retaining the "HD" buzzword, and as such it is questionable whether or not it should be retained in this table. I included them in my revision for the sake of completeness, but if they are to be kept, clear distinctions must be made.
- 1024×768 is NOT 720p. 1366×768 is NOT 720p nor 1080i. These are WXGA resolutions, and would be (if anything) 768p in the video realm, but they are rarely referred to as such as they are non-standard substitutes for native 720p displays.
- "Image Aspect Ratio" is unclear, as this could refer to any given property of the image. The three most common terms used in referring to aspect ratios are Source (or Stored) AR, Display AR, and Pixel AR. This column was renamed from "Display Aspect Ratio" for no apparent reason, and the values changed thereunder. Display AR is calculated as the the vertical resolution (width) divided by the horizontal resolution (height). Thus, XGA's DAR would be 1024/768, which equals 1.333... or 4/3. It is currently listed as "16:9" which is mathematically incorrect.
- Almost every HD and computer resolution has a square (1:1) Pixel Aspect Ratio. The exception here is HDCAM/HDV, which is anamorphic. The current version contains inaccuracies in this regard, and lists XGA, 1024×1080, and 1280×1080 as all anamorphic (and omits the necessary Y value of "1" for that to even work), when anamorphic treatment is not inherent to those resolutions and is applied, if at all, on a per-vendor basis. The PAR for 1366×768 is also listed as "1:1 Approx," which, even if it were correct, fails to specify what figure the approximation is derived from (see the sketch after this comment for how these ratios relate).
- "Used on X displays..." etc. - Video resolutions are not bound to any one technology. An LCD, DLP, or Plasma can all display however many number of pixels they are designed for. There are certain limitations with each technology, of course, but that isn't being addressed here (and probably doesn't belong anyway). This wording makes it seem as though the resolution is proprietary to the type of television.
- The current version links to the same articles several times, when only the first instance of a new term needs a link. It is also redundant in linking to both 1080p and Full HD, which redirect to the same article, as well as 1080i/p (which could be used instead).
- 2160p is not yet an HD format of any kind beyond use in prototype/showcase televisions and does not belong in this table. In my revision I added a paragraph briefly mentioning super-HD formats, but this was removed.
- The "A common pixel resolution used in HD Ready LCD TV panels is..." paragraph is irrelevant to the topic and was previously marked as Confusing; I clarified this but it was replaced with the original text.
- I am hoping that this can be taken to a vote of some kind as it is very discouraging to see such inaccuracies go unattended on Wikipedia when many people out there are already confused about the nature of digital video to begin with. —Preceding unsigned comment added by 71.192.23.231 (talk) 02:04, 29 January 2008 (UTC)
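To illustrate how the three ratios mentioned above fit together (a rough sketch, not taken from any of the cited specs): display aspect ratio = stored aspect ratio × pixel aspect ratio, so the PAR of an anamorphic format falls out once the other two are known:

 from fractions import Fraction

 def pixel_aspect_ratio(stored_w, stored_h, display_ar):
     # PAR = DAR / SAR, where SAR is the ratio of the stored pixel grid
     return display_ar / Fraction(stored_w, stored_h)

 print(pixel_aspect_ratio(1440, 1080, Fraction(16, 9)))  # 4/3 -> anamorphic (HDCAM/HDV)
 print(pixel_aspect_ratio(1920, 1080, Fraction(16, 9)))  # 1 -> square pixels
 print(pixel_aspect_ratio(1366, 768, Fraction(16, 9)))   # 2048/2049 -> the "1:1 Approx" case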
- This table needs further corrections:
- "Active Frame" incorrect -> "Active" is specifed by SAV/EAV bit, thus 720x480 for NTSC, 720x576 for PAL by ITU-R BT601-5, you might want to represent "Clean Aperature", or "Safety Area(frame)". Even though, you should use "5%", or should specify overscan ratio.
- "NTSC-M" nor "SMPTE 170M" does NOT specify active lines per frame. It only suggest VBI(Vertical Blanking Interval),which is 40 lines. Excluding 40 lines from 525 lines of Total, makes 485 lines, however it is NOT specifed in that way. Even, Start of Active Linenumber is different from manufacturers to manufacturers. (Eg. Company A assusmes that Line No.20 as Active, while the other use Line No.21)
- "4320:4739", "5760:4739" Where did you get this numbers? Is it derived from some equations? There is no such numbers in ITU-R BT.601-5 document, nor "ITU-R BT.709"
- Elastino 01:32, 22 September 2008 (UTC) —Preceding unsigned comment added by Elastino (talk • contribs)
European HDTV vs. North American HDTV
Is there any regional difference between the two regions' HD signals, as there is with SD NTSC vs. PAL? Or are HDTV sets simply global in standards? Could a 1080i-capable set in the USA receive a 1080i signal in Europe?
Europe still uses 25 and 50 frames per second; the USA uses ~24, ~30 and ~60 fps. Totsugeki 06:55, 29 July 2007 (UTC)
This response doesn't really answer the question posed. As there appears to be backward compatibility with earlier SD standards, the table of digital video resolutions for HDTV implies that it will handle differing frame rates for both NTSC and PAL regions (24, 30, 60 and 25, 50 progressive; 50, 60 interlaced). As manufacturers naturally seek significant economies of scale in production, is there any difference in the HDTVs for sale in North America from those sold in Europe (apart from the tuners and power inputs, of course)? If not, can a PAL-sourced video or DVD be viewed on a North American-purchased HDTV without any problems? 70.68.182.33 22:49, 20 August 2007 (UTC)
- It's all a fuzzy headache. I don't believe that there is any requirement that 50 Hz work in the US, or 60 Hz work in Europe. It's likely that many manufacturers will do this, but you won't be able to count on anything, and will probably have to dig deep into manuals to see who supports what. It's UNLIKELY that you will see CRT-based sets supporting the "foreign" field rates, because they would have to build different hardware to deal with each frequency. That is probably a primary reason we never got agreement on a worldwide standard. LCDs and DLPs can switch frequency much more easily. Algr 04:25, 21 August 2007 (UTC)
- Veto
- My 14-year-old cheap TV supported PAL and NTSC. I was able to play on my Amiga computer in both formats. AFAIR most cheap TVs from Asia did this; brand names did not. And also remember that a PC power supply operates on various voltage and frequency standards. The OP is right about "economies of scale in production". —Preceding unsigned comment added by 84.188.77.233 (talk) 16:02, August 27, 2007 (UTC)
18 formats! So why do broadcasters ignore this and focus on two (720p60 and 1080i60)?
OK, so maybe someone can explain to me why over-the-air broadcasters only broadcast 2 formats when there are 18 to choose from? Example: cartoons and film movies are both 24 fps, so why, when I watch the Simpsons on FOX, do they transmit at 720p60 and I get interlace artifacts from the telecine? The same also goes for 1080i60. Why not just transmit at the native frame rate with 1080p24? There is no equipment issue at the stations, since this is all digital and the ATSC modulator doesn't care about the digital info it gets as long as it isn't too much, because the resolution and the frame rate are not integral to the ATSC modulator; it's just a bit stream. Converting everything to 1080i60 or 720p60 seems like a lot of work and not necessary. If the flexibility was built into the system, why ignore it on purpose and make things more complicated? Madzyzome 21:08, 15 June 2007 (UTC)
Answer: because most TVs don't natively display 1080p, only the newest do, so 1080i is the most common set right now.
- Native 1080i sets are very rare now. By far the most common HD sets are 768p and 720p. There are even some 1440x900p sets floating around for some reason (...which the stores keep calling "720p"). Algr (talk) 09:38, 29 February 2008 (UTC)
1440x900p and 1680x1050p are common computer monitor resolutions, add tuners (NTSC, ATSC and QAM) and they make great TVs too. I suppose they may call them 720p because more consumers know what that means and it's the only HD resolution supported without having at least 1080 lines. Gatortpk (talk) 04:07, 16 June 2008 (UTC)
NO! That did not answer my question. Didn't you read it? Obviously not. Please don't respond anonymously with useless answers; it takes up valuable bytes on the Wikipedia servers. Madzyzome 04:39, 1 September 2007 (UTC)
Clarification to above: lesser displays (720p and 1080i) would display 1080p transmissions just fine, just downconverted (unless the television decoder is not compliant with ATSC). Totsugeki 06:57, 29 July 2007 (UTC)
- Why not 1080p24? Because most TVs can't display or accept 1080p. Why not 720p24? Depends on the show's source. Currently, a lot of programming on HD channels isn't from an HD source. Rather, a 480i source (taken from a D1, Betacam or other tape) is upscaled to 720p or 1080i and transmitted that way. Also, consider what happens when you play a console or some other source and you change the resolution (especially through HDMI). Usually the screen blanks out for a little bit and comes back on. Something tells me that most of the TV stations want such an effect to happen as infrequently as possible, so they just find a frame rate that they can display all shows at, utilize the appropriate telecine filters, and output everything at 60 fps. This allows them to display all programming, regardless of source, on a continuous stream with no interruptions as the signal may change from 720p24 to 30 or to 60. Finally, the most important thing to consider is that TV is not a medium for the videophile, but rather the lowest common denominator. Most people don't notice said effects, and even someone like me who's sensitive to video crap doesn't notice many of the effects of displaying 24 or 30 fps source at 60 fps. Kakomu (talk) 19:14, 19 December 2007 (UTC)
OK, there are several problems with what you just said. In order for a TV to be certified by the ATSC, it must be able to receive all 18 formats defined by the ATSC, and 1080p24 is one of them (however, displaying them in said formats is entirely up to the manufacturer). When changing from one format to the other, the MPEG-2 transport stream is not broken, so there is no blanking while the TV's internal circuitry attempts to sync up, identify, decode, and demux the signal (however, I am not sure this has ever been done before). Now I am not sure what you meant by "TV is not a medium for the videophile"; what device would a videophile use if they wanted to watch, say... a Blu-ray disc or HD DVD? My main point was that with all the inherent flexibility that the ATSC put into their format, why do the individual stations stick with only one, which requires them to scale the frame rate of different media in order to match theirs? It just seems like a lot of work for nothing. Madzyzome (talk) 23:24, 7 February 2008 (UTC)
- (Re-clarified for Madzyzome's rant below) 720p and 1080i require disparate studio equipment (cameras, for instance: the Panasonic HD930 is a 1080i camera, while the Panasonic AK-HD 931 is a 720p camera; plus editors, mixers, etc.). Various networks have selected one of these formats for HD... e.g., ABC and ESPN (Disney) selected 720p; CBS and Showtime (Viacom) selected 1080i. (Some complexity results when a CBS affiliate decides to purchase 720p studio equipment and "downconvert" the network feed.) As to why 1080p24 is not one of the chosen two for HD, it has visible flicker (for some visual systems, subjective). As for 1080p60, it cannot fit in a 6 MHz channel using MPEG-2 compression (and it looked awful (to me, subjective) with MPEG-4 compression). Suggestion: consider decaf. -Dawn McGatney 69.139.231.9 (talk) 07:06, 22 April 2008 (UTC)
- OK, I don't think you understand the fundamental differences between different codecs. At the same bit rate, resolution and frame rate, MPEG-4 cannot look worse than MPEG-2, since MPEG-4 is a more efficient codec. I don't know why you think a 1080p60 signal can't fit in a 6 MHz channel using MPEG-2; there is no technical reason why it can't. Any codec can look bad or good, and it all depends on its efficiency and bit rate, along with the resolution and frame rate of the source video. So if you say "and it looked awful (to me, subjective) with MPEG-4 compression", that is just ignorance of the process of video compression. A layman with limited knowledge of video compression sees an HDTV broadcast and says "wow, this MPEG-2 video is awesome", then sees a webcast in MPEG-4 and says "this video is awful quality", so they assume MPEG-2 is superior in quality to MPEG-4. This assumption, however, is completely wrong. Given the same bit rate, resolution, and frame rate, MPEG-4 will always have a higher-quality picture than MPEG-2. Madzyzome (talk) 22:26, 8 February 2009 (UTC)
Wow, you said a mouthful; unfortunately your argument is full of technical errors. Are you sure they require disparate equipment? Since frame rate is not hardware-dependent in an ATSC modulator as it is in NTSC, why would the ATSC modulator care what's on the transport stream as long as it's a bunch of 1's and 0's? Do you work at a TV station (on the equipment)? Are you familiar in any way with the equipment used by TV stations to modulate ATSC? When you assume, you make an "ass" out of "u" and "me". How do you know 1080p24 will flicker? Do movies at the movie theater flicker for you? Do you think 3:2 pulldown or telecine will help? I won't go into it, but if the native format is 24p for movies and cartoons, they should just keep it that way. No one to my knowledge uses 1080i30; I suspect what you meant was 1080i60. You said "1080i60 cannot fit in a 6 MHz channel" - well, it can and does, i.e. CBS, NBC et al.; I think what you meant was 1080p60. The thing is that practically any format can fit into the 6 MHz channel, even 2160p60 Quad HDTV; it all depends on the modulator and the compression. In the case of ATSC, 8VSB (8-level vestigial sideband) modulation carries about 19.4 Mbit/s in the 6 MHz channel, and because cable TV has a higher SNR (signal-to-noise ratio) it uses 64QAM (quadrature amplitude modulation), 128QAM or 256QAM and can achieve bit rates of around 38 Mbit/s, which is enough for several HD channels, all in a 6 MHz carrier. Lastly, how could something "look awful" in MPEG-4 when it doesn't in MPEG-2, if MPEG-4 is more efficient? MPEG-4, and more specifically H.264/MPEG-4 AVC, is more advanced and can achieve roughly twice the compression at the same video quality. The only thing you said correctly was that analog TV transmissions will be over in one year. I guess 1 out of 4 isn't bad. Madzyzome (talk) 23:24, 7 February 2008 (UTC)
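As a back-of-the-envelope check on the bandwidth side of this exchange (a hedged sketch only: 19.39 Mbit/s is the nominal ATSC 8VSB payload, and the raw source rates assume 8-bit 4:2:0 video):

 def raw_bitrate_mbps(width, height, frames_per_second, bits_per_pixel=12):
     # Uncompressed rate for 8-bit 4:2:0 video (12 bits per pixel on average)
     return width * height * frames_per_second * bits_per_pixel / 1e6

 ATSC_8VSB_PAYLOAD_MBPS = 19.39  # nominal MPEG-2 transport payload of a 6 MHz channel

 for label, fps in [("1080i (30 full frames/s)", 30), ("1080p60", 60)]:
     raw = raw_bitrate_mbps(1920, 1080, fps)
     print(f"{label}: ~{raw:.0f} Mbit/s raw, ~{raw / ATSC_8VSB_PAYLOAD_MBPS:.0f}:1 compression needed")

 # 1080i (30 full frames/s): ~746 Mbit/s raw, ~38:1 compression needed
 # 1080p60: ~1493 Mbit/s raw, ~77:1 compression needed

Whether MPEG-2 can do the 1080p60 case at acceptable quality is exactly what is being argued above; the arithmetic only shows that it needs roughly twice the compression of 1080i in the same channel.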
I do work in the industry, and the company I work for is a leader in frame rate conversion. Broadcasters in North America did choose 1080i60 (which is really 30 interlaced frames, but for some reason we don't say field/frame anymore) and 720p60 for several reasons. 24p formats DO flicker for viewers used to 30/60-frame formats; this is not technology but biology. Eventually you won't notice the flicker if 24 is all you watch; however, if you switch back and forth from the 30/60-based formats to 24, you would always notice. Secondly, it is much easier for North American broadcasters to up-convert NTSC programming to HD (either 720p60 or 1080i60), as you only need to change the number of lines per frame, not the FPS. Not all broadcast plant equipment will work with the family of 1080p24 formats (actually 23.98PsF is the most popular). Animators and cartoon producers do not want to use formats based on 30/60, as they would need to create 6 more frames per second. The traditional 'film producers' use 1080p24 as a production format because they are familiar with it and feel it imparts a 'film-like' quality. To answer your original question, the broadcast process always ends up in Master Control, where the different sources are mixed together - the final step before transmission. You simply can't cut cleanly between different frame rates. Also, even broadcast equipment that is designed to work in all formats will not jump between them in real time. I believe that the higher the frame rate, the easier it is for your brain to interpret the picture. Have hope: as soon as the Simpsons start using 'vector motion'-based frame rate conversion, it will look a lot better. Yclept7 (talk) 22:39, 15 February 2008 (UTC)
- Yclept: No one since the 1930s has tried to display 24p without some kind of frame doubling to eliminate flicker. It makes no difference whether this happens in the TV studio (as would happen with an NTSC broadcast) or in the home (as happens on most DVDs and 24p ATSC broadcasts). 2:3 pulldown is an inherent part of any 24p format, so flicker is no reason not to transmit 24p. Hollywood LIKES the slight motion judder that is seen on anything shot on film or 24p video; too much time resolution interferes with the suspension of disbelief (see high motion). Apple has 1080p24 files on their web site that run at half the data rate of broadcast HD and yet look absolutely gorgeous - much better than most actual HD broadcasts I've seen. [1] So there is no way that there could be any technical problem with 1080p60 via MPEG-4. I expect you probably saw a bad demo somewhere once. Algr (talk) 01:41, 16 February 2008 (UTC)
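As an aside, the 2:3 cadence referred to above is easy to sketch (illustrative only): alternate film frames are held for two and then three fields, so four 24p frames become ten 60i fields:

 def two_three_pulldown(frames):
     # Map 24p frames onto 60i fields using the 2:3 cadence
     fields = []
     for i, frame in enumerate(frames):
         repeat = 2 if i % 2 == 0 else 3   # alternate 2 fields, 3 fields
         fields.extend([frame] * repeat)
     return fields

 print(two_three_pulldown(["A", "B", "C", "D"]))
 # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] -> 4 frames become 10 fields (24 -> 60 per second)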
- Broadcasters are not limited to "2 formats". 480i30 is legitimate (SD), along with 1080i30 and 720p60 (HD). "1080i60" is not an ATSC format. "1080p24" produces visible flicker (and migraines). And technically, there are not "18 formats" but 36... 720p60, 720p59.94, etc. -Dawn McGatney, 29 February 2008.
- AARG! There is no "Visible flicker" from any 24p format! Stationary objects can be rock solid, and film holds exactly the motion characteristics that movie directors like. Algr (talk) 09:38, 29 February 2008 (UTC)
- Right. How silly. Which is why motion pictures are shown in theatres at 48 frames per second. (Each film image is doubled.) See also NTSC analog TV, where 30 frames/ sec is interlaced to 60 fields/ sec to reduce flicker. -Dawn McGatney. —Preceding unsigned comment added by 69.139.231.9 (talk) 21:08, 1 March 2008 (UTC)
- Dawn, no real TV sold would flicker when given a 24p signal unless it was broken. What you are describing is PART OF the 24p format. The only way you would get flicker is if you built a non-standard display or decoder that lacked 2:3 pulldown or frame doubling. One exception: an LCD set would not need frame doubling because each pixel remains open passing light until it is instructed to switch to a different value. I don't know if LCDs actually do it this way, but it would work. Algr (talk) 08:54, 2 March 2008 (UTC)
- The actual LCD panel cannot display 24p due to the internal timing controller's PLL locking range. The PLL is designed to maintain around 60 Hz (48~72 Hz) for vertical rates. Thus, 1080/25p/25PsF/24p/24PsF should use a 48~50 Hz panel driving rate (or triple it). Elastino 01:53, 22 September 2008 (UTC)
- Flicker is in the visual system of the beholder. -Dawn McGatney 69.139.231.9 (talk) 02:55, 11 March 2008 (UTC)
- "Flicker", and "Judder" are completely different. "Flicker" is visible when physicial(Viewer's) scanning rate goes lower than around 50Hz. "Judder", is caused by Frame Rate, not the physical problems. In other words, if you see "Still Image" with 1080/24PsF with 72Hz SONY CineAlta TrippleScanning monitor, you won't see flicker. If you use 48Hz mode with same monitor, you will notice some flickers due to CRT response time. Elastino 01:51, 22 September 2008 (UTC) —Preceding unsigned comment added by Elastino (talk • contribs)
"HDTV is the answer to a question few consumers were asking."
That's NPOV how? The "citation" for that sentiment is a blog.
It'd be at least somewhat valid to say "A survey of consumers showed that initial consumer demand for HDTV was minimal", or whatever the source actually supports, followed with a citation, of course.
SHDTV?
Hi there. I'm the owner of an HDTV website. As the owner and as a person looking for new content about HDTV, I was wondering what you guys know about SHDTV, or super high definition television. It seems very exciting. Thank you. —The preceding unsigned comment was added by Moneer189 (talk • contribs) 16:20, 14 February 2007 (UTC-8)
- There are already articles for 3840 x 2160 (2160p) and 7680 x 4320 (Ultra High Definition Video). Thewikipedian 12:56, 22 February 2007 (UTC)
Broken Links
There is a reference to the "Canadian Digital Television Official Website", but the link is broken! The link points to cdtv.ca, which is a registered domain, but pages from this site are not currently resolving. Audley Lloyd 22:23, 17 March 2007 (UTC) —The preceding unsigned comment was added by Audley Lloyd (talk • contribs) 21:20, 3 March 2007 (UTC).
The last external link, CEA'S HDTV Guide, is broken. 24.6.113.149 17:51, 26 October 2007 (UTC)
Removed the "HDTV vs film" section
I looked at this section for a long time, realized it had to be completely rewritten, started to, and then realized that it just shouldn't be. It's wrong:
1. The big ongoing debate in the TV/film world is HD vs. film, not HDTV. It's about whether we should shoot on HD or film; it's obvious it will be broadcast on HDTV. And at the other end of the chain, nobody is wondering whether they should install a film projector in their living room or an HDTV set.
2. Even movie theaters are not considering HDTV vs. film, but higher-quality, higher-definition formats (see digital cinema).
3. From my mediocre wiki understanding, comparing X and Y is fine, but "X vs Y" already sounds unencyclopedic.
Binba 05:46, 5 March 2007 (UTC)
Image choice
Why are there two very similar images of home cinema projection screens, and none of a standard high-definition TV, when I'm sure the vast majority of HD TV viewers are only privy to the latter? Perhaps one of the images needs to be replaced? Yeanold Viskersenn 22:44, 9 May 2007 (UTC)
Agree, one of them needs to be swapped for a display using some other technology. --Ray andrew 00:55, 10 May 2007 (UTC)
Internet HDTV distribution
The text said: "In addition, 720p is used more often with Internet distribution of HD video". I see the opposite: all the HD videos I have downloaded from the web are 1080i. See for example: www.mariposahd.tv. Vmsa 03:13, 23 May 2007 (UTC)
Link
I suggest adding http://videos.howstuffworks.com/hdtv-video.htm to the external links section. --HybridBoy 06:03, 25 May 2007 (UTC)
- Neutral: Seems to be somewhat useful, but the full-motion ad that takes up half the screen is kind of annoying, and may violate WP:EL's prohibition against links with objectionable amounts of advertising. Leuko 19:29, 26 May 2007 (UTC)
Difference Interlaced / Progressive
The article incorrectly states that interlaced video has a higher resolution and fewer frames than progressive. In fact it is the other way round. 80.60.95.120 20:38, 29 May 2007 (UTC)
Ahhhh.... please explain yourself. Are we supposed to take a statement signed by an IP address at face value? Pstanton 04:35, 8 July 2007 (UTC)
I will try to explain. 1. Resolution: 1080p -> 1080i: no higher resolution (on a still image or still parts), but half the bandwidth/bitrate. 720p -> 1080i: higher resolution (on a still image or still parts) at comparable bandwidths/bitrates. So the qualifier "at similar/comparable bandwidths/bitrates" would need to be added.
2. Motion rate: interlaced frames hold two fields, so the field rate is double the frame rate. It's not common knowledge that the alternating fields can (and in television usually do [at least in PAL], except when broadcasting movies) hold different, successive phases of motion. This effectively doubles the motion rate compared to progressive material at the same frames per second.
This is the true reason (telecined movies excepted) for the combing effect that occurs when such frames (holding fields that belong to different motion phases) are displayed as if they were progressive rather than interlaced, i.e. with both fields shown at the same time, which is a mistake! Proper deinterlacing must be done instead; see the short sketch below. (When done right, the result is comparable to progressive material at double the frame rate - just the moving parts have lower vertical resolution [every other line is interpolated].)
(On a 50/60 Hz CRT television there is no combing from interlacing, as it doesn't display both fields at the same time. 100/120 Hz televisions differ in how they cope with the subject.)
P.S. I don't know why this subject is completely missing from the "interlace" page of Wikipedia.
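A rough illustration of the point above about fields carrying separate moments in time (a naive sketch; real deinterlacers use motion-adaptive or motion-compensated interpolation rather than this simple line doubling):

 def split_fields(frame_lines):
     # Split an interlaced frame (a list of lines) into its top and bottom fields
     return frame_lines[0::2], frame_lines[1::2]

 def bob_deinterlace(field):
     # Naive "bob": repeat each field line to rebuild a full-height picture
     out = []
     for line in field:
         out.extend([line, line])
     return out

 frame = ["t0", "b0", "t1", "b1", "t2", "b2"]   # lines captured at two different instants
 top, bottom = split_fields(frame)              # ['t0', 't1', 't2'], ['b0', 'b1', 'b2']
 print(bob_deinterlace(top))                    # one output picture per field, so double the
 print(bob_deinterlace(bottom))                 # temporal rate, but half the vertical detail
                                                # on anything that moved between the fields

Showing both fields at once (weaving them back together) is exactly what produces the combing described above.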
In other languages
since this article can't be edited, please change to [[pt:HDTV]] --UlissesCarvalho 13:11, 1 June 2007 (UTC)
- I have done that, ok. AxG @ ►talk 13:24, 1 June 2007 (UTC)
Mandated by Law
The article 2000s in television currently says "2009 in television - ... All Analog TV signals in the United States are mandated by law to be shut off and switched to HDTV format." If this is true, how come it's not mentioned in this article? Seems of pivotal importance to HDTV in general to me.--190.39.214.44 15:29, 16 June 2007 (UTC)
- There's no such law. The FCC couldn't care less what resolution you broadcast in. In February 2009, all channels will have by that point shut off their analogue (NTSC) broadcasts and be broadcasting solely in digital (ATSC). --69.123.165.15 02:50, 26 July 2007 (UTC)
- There is too a law. "Congress passed a law on February 1, 2006, setting a final deadline for the DTV transition of February 17, 2009. Most television stations will continue broadcasting both analog and digital programming until February 17, 2009, when all analog broadcasting will stop. Analog TVs receiving over-the-air programming will still work after that date, but owners of these TVs will need to buy converter boxes to change digital broadcasts into analog format. Converter boxes will be available from consumer electronic products retailers at that time. Cable and satellite subscribers with analog TVs should contact their service providers about obtaining converter boxes for the DTV transition."http://www.dtv.gov/consumercorner.html71.247.42.134 02:02, 15 August 2007 (UTC)
- The law is for converting from analog signals to digital signals, not from SD to HD. —Preceding unsigned comment added by 209.50.115.210 (talk) 18:44, 26 September 2007 (UTC)
PAL resolution wrong
The diagram and text show PAL DV as being 768x576 pixels. This is incorrect; it is actually 720x576 pixels. The 768 figure is the equivalent in square pixels, to compensate for the non-square pixels. It may be used when creating graphics on a computer for PAL display, but the actual resolution is 720 pixels (a pixel aspect ratio of about 1:1.067).
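The 768 figure is just the 4:3 frame re-expressed in square pixels, which is also where the ~1.067 pixel aspect ratio quoted above comes from (quick check):

 from fractions import Fraction

 square_pixel_width = 576 * Fraction(4, 3)   # 4:3 display shape across 576 active lines
 print(square_pixel_width)                   # 768 -> the figure shown in the diagram
 print(Fraction(768, 720))                   # 16/15 -> the implied pixel aspect ratio
 print(float(Fraction(768, 720)))            # 1.0666... ~ 1.067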
NTSC Frequency
The NTSC frequency is 59.94 Hz, not 60 Hz! This applies to both SD and HD. Strange that nobody pointed that out.
Can anyone explain why? The use of 59.94/29.97 as the NTSC field/frame rate has its history in the choice of 3.58 MHz as the colour subcarrier frequency for NTSC colour encoding, which was itself driven by the need for backwards compatibility with existing monochrome receivers and by the separation between the video and audio carrier frequencies in the transmission domain. So why is HD, which is broadcast by digital means, still using these awkward frame rates?
Please tell me it's NOT for ease of use of HD video with SD systems....
Duncanrmi (talk) 20:39, 30 November 2008 (UTC)
- Compatibility. No one wants to sit down and edit 40+ years of video tape with a frame rate of 59.94/sec.--McGatney (talk) 15:06, 21 April 2009 (UTC)
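For what it's worth, the awkward rate the question refers to falls straight out of the NTSC colour arithmetic (a sketch of the usual derivation, using the standard NTSC constants):

 line_rate = 4.5e6 / 286           # 4.5 MHz sound-carrier spacing divided by 286 -> ~15734.27 Hz
 field_rate = line_rate / 262.5    # 525 lines per frame, two fields per frame -> ~59.94 Hz
 subcarrier = line_rate * 455 / 2  # colour subcarrier -> ~3.579545 MHz

 print(round(field_rate, 4))          # 59.9401
 print(round(60 * 1000 / 1001, 4))    # 59.9401 -- the same number, usually written 60/1.001
 print(round(subcarrier))             # 3579545

Digital HD kept 59.94/29.97 largely so that material moves to and from the NTSC-era archive without rate conversion, which is essentially the compatibility point made in the reply above.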
Technical Details: is this correct?
Need some citation on "MPEG-2 is more commonly used". I understand that the HDTV systems in place in North America are mostly MPEG-2 based. This is unlikely to change soon, as so many people have expensive hardware that would be rendered useless by an MPEG-4 switch... I know I wouldn't be happy! However, aren't the HDTV systems elsewhere in the world almost exclusively MPEG-4? Perhaps some metric is required here, comparing the number of installed tuners or broadcast channels for each standard, rather than a blanket statement that one standard is more common than another with nothing to back it up?
Link to The High Definition Guide?
I'd suggest adding a link to http://www.thehighdefinitionguide.com. This site seems to answer a lot of questions about HD.
Explanations are in clear English, and the inclusion of a FAQ and forum gives Wiki users who want to find out more a place to go...
HDTV Sources
I don't see the Windows PC listed as an HDTV source. E.g. my LCD HDTV has a VGA input, so I have watched many HD movies I downloaded via BitTorrent. BTW, the page is uneditable for me, maybe because my IP address is from the Netherlands.
The list item about HDMI and DVI interfaces covers the personal computer, be it one running Windows or any other Operating System. Alex53 (talk) 16:57, 24 June 2008 (UTC)
actual broadcast resolutions
The actual resolutions at which HDTV is broadcast are actually a little lower. They broadcast a percentage fewer lines to save on expensive bandwidth, and the image is merely scaled to fit your TV. It is still more lines than standard def, but the way this article reads, you could be forgiven for thinking that the TV stations actually BROADCAST 720 lines. JayKeaton 19:45, 30 August 2007 (UTC)
- You're mistaken; every line of 720p or 1080i is broadcast. Perhaps you are thinking of overscan, where your TV throws out some of the lines? --Ray andrew 00:00, 31 August 2007 (UTC)
Check the official broadcast specifications instead of the numbers that are displayed on the television when it tunes into HDTV. JayKeaton 07:34, 31 August 2007 (UTC)
- Give me a link to any specification that says otherwise. Again I believe you are mistaken. --Ray andrew 13:28, 1 September 2007 (UTC)
I don't have a "link" to anything, but if you like I can email you a scan of a report from the hi def summit earlier this year. Also if you look at the specs of the equiptment major tv stations in the US use to broadcast you will see the same thing. The companies that make the broadcasting stuff can be found online. JayKeaton 17:18, 2 September 2007 (UTC)
- Just for the record: the broadcast standard is 1920x1080 pixels, all of which carry picture data. However, an HDTV should only display the centre 1877x1000 pixels. 86.148.141.199 (talk) 16:59, 20 June 2009 (UTC)
Apparent bias in Advantages/Disadvantages
Why are there negative attributes listed in the "Advantages" section? I am disputing the neutrality of this article. --algocu 20:17, 19 September 2007 (UTC)
- I wouldn't call that bias so much as disorganized writing. I never did like that "non-engineering terms" section. The best way to organize a wiki article, IMHO, is to have things in simple terms at the top, and then go into more technical detail further in. Algr 06:16, 22 September 2007 (UTC)
- The Disadvantages section, on the other hand, contains arguments which are not specific to HDTV, like the 16:9 format problems, which are the same for e.g. WidePAL. Further, after studying the situation for quite a while, I find one big disadvantage missing from this section: the lack of recordability on external media. To the contrary, the article goes on to state that HDTV can be recorded on D-VHS and W-VHS. W-VHS is gone, and D-VHS recorders have only input terminals (composite) which are incompatible with the output terminals on HDTV players or satellite receivers (HDMI). So I assume this is a real and significant disadvantage. Another disadvantage not mentioned in this section is that most known HDTV displays are only able to produce a good image when fed HDTV signals. As long as there is also PAL/NTSC to be displayed, these displays have problems converting the SDTV signals into an acceptable image at the native resolution of the display. The only exceptions are CRT beamers and HD-Trinitron TV sets from Sony, which do not have fixed native resolutions. Thyl Engelhardt 213.70.217.172 14:14, 27 September 2007 (UTC)
- The writing in this section could be improved but I see no bias problems in there.--Rtphokie 14:43, 23 October 2007 (UTC)
4:3 image on 16:9
Who the heck wrote this?
Older films and programming that retain their 4:3 ratio display will be presented in a version of letterbox commonly called "pillar box," displaying bars on the right and left of 16:9 sets (rendering the term "fullscreen" a misnomer). While this is an advantage when it comes to playing 16:9 movies, it creates the same disadvantage when playing 4:3 television shows that standard televisions have playing 16:9 movies. A way to address this is to stretch the 4:3 image horizontally to fill the screen or reframe its material to 14:9 aspect ratio, either during preproduction or manually in the TV set.
This is bollocks. Why would anyone want to stretch a 4:3 image? Do people actually like things to look distorted? And "reframe" the material? In other words, chop off the top and bottom to fit 16:9? Why? Why is a 4:3 image on a 16:9 screen a "disadvantage"? A 16:9 screen size is meant to have flexibility, so that you can exhibit many different aspect ratios, such as widescreen 2.35:1, 1.66:1, and yes, 1.33:1, or 4:3. So what if there are pillars on the sides? If the Simpsons is 4:3, it should be viewed in 4:3.
The above section, taken from "Advantages of HD", is ignorant and certainly not neutral. Sladek 16:16, 16 October 2007 (UTC)
- Agreed. I removed the whole paragraph. To me, it looked like an argument against, not for, HD. The topic of 4:3 material displayed on 16:9 screen is already covered under the heading of "Disadvantages..." Binksternet 17:33, 16 October 2007 (UTC)
My point is that a 4:3 image on a 16:9 screen having unused screen on the sides is not inherently a disadvantage. Again, why is it a disadvantage? This is subjective.
Many consumers aren't satisfied with this unused display area and choose instead to distort their standard definition shows by stretching them horizontally to fill the screen, giving everything a too-wide or not-tall-enough appearance. Alternately, they'll choose to zoom the image which removes content that was on the top and bottom of the original TV show.
What is the consumer obsession with filling the screen, to the detriment of the image? There are many, many things that do not have 16:9 aspect ratio, and there are many people that see no problem whatsoever with "unused display area". Sladek 15:01, 18 October 2007 (UTC)
- Because the "side-bars" are static images, they can eventually "burn-in" on many plasma screens. -Dawn McGatney 69.139.231.9 (talk) 04:37, 15 March 2008 (UTC)
My dad's TV in the UK automatically stretches 4:3 pictures to fill the 16:9 display. There's an option to turn it off. The centre of the screen isn't stretched, just the edges. It's noticeable if there are lots of straight lines, e.g. rail tracks, power lines, or the edges of buildings at the edge of the screen, since they become wonky lines. —Preceding unsigned comment added by 78.86.151.120 (talk) 19:05, 13 January 2008 (UTC)
Here in Malaysia we don't broadcast TV signals in 16:9, and yet widescreen TVs and projectors are common and, importantly to the argument, are seen as luxury items. People buy them and change the aspect ratio to obtain a full-frame picture regardless of the distortion. While I agree with Sladek that it is a crime to distort the aspect ratio when watching a broadcast, we have to realise that this is our personal opinion and we are the minority. I've worked in TV for 10 years in 3 markets (Australia, UK, and Malaysia) on the programming side of the business, and I've seen the public do this in all markets. I believe the edited comment is accurate. Expat Justin (talk) 04:09, 24 March 2008 (UTC)
Distorting the aspect ratio happens a lot lately, especially when footage from different sources, each with its own aspect ratio, is brought together in one broadcast or computer video file. I have found that most people either don't care or don't even notice. But IMHO, a movie/video fan should care about it. Isn't there some sort of technical committee which supervises or controls compliance with the correct ratio? White rotten rabbit (talk) 08:03, 18 February 2009 (UTC)
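For concreteness, the three common ways a set maps 4:3 material onto a 16:9 panel work out as follows (simple geometry, assuming a 1920x1080 panel; the figures are illustrative, not from the article):

 PANEL_W, PANEL_H = 1920, 1080
 SOURCE_AR = 4 / 3

 # Pillarbox: preserve the source shape, pad the sides with bars
 image_w = round(PANEL_H * SOURCE_AR)                    # 1440
 print("pillarbox bars:", (PANEL_W - image_w) // 2)      # 240 px each side

 # Stretch: fill the width, distorting everything horizontally
 print("stretch factor:", round(PANEL_W / image_w, 3))   # 1.333 (circles become ovals)

 # Zoom: fill the width and crop the top and bottom instead
 zoom_h = round(PANEL_W / SOURCE_AR)                     # 1440 lines after scaling
 print("zoom crop:", zoom_h - PANEL_H, "lines, or",
       round(100 * (zoom_h - PANEL_H) / zoom_h), "% of the picture")   # 360 lines, 25%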
Suggested external link: Xbox Live Video Offerings In HD
I'd like to add a link in this article's External links section that points to a page I've maintained for some time now that lists all the HD programs that are available for download from the Xbox Live Marketplace service: http://kplusb.org/xboxlivehd/. Does that link seem like it might be useful to Wikipedia users who read this article? Bryan H Bell 19:12, 18 October 2007 (UTC)
- Since there has been no objection so far, I'm adding the link. Bryan H Bell 20:50, 27 October 2007 (UTC)
Removed opinion reference
Someone trying to be clever added "HDTV is an answer to a question few viewers were asking" and added a reference to a blog in which a librarian pontificates about digital issues. Not only is that statement non-factual (lots of people wanted to know when HDTV was coming), its "source" is not primary information and does not address the assertion made in that statement. It's not the purpose of an encyclopedia to provide commentary or "analysis" a la a newsmagazine. --papamurphy
Early HDTV (French 819 line 755i)
Could the French 819-line monochrome system not be considered an early form of HDTV?
- Not really. Because it was interlaced, this system's theoretical best picture would have been clearly inferior to 720p. And of course it was black and white 4:3 only. In practice, the tube technology of the day meant that actual sets and broadcasts fell well below the quality that the signal could theoretically have delivered. Also, I wonder how many people actually owned such sets? Was it like the CBS color sets in the US in the 1950s? Algr (talk) 04:52, 21 November 2007 (UTC)
- 1080i video is interlaced as well, yet is still considered HD. Rather, the defining features of HD appear to be twofold: resolution greater than the old standards (480i) and the fact that it's a digital source. This would exclude 480p, because the resolution isn't any higher, and would exclude the 755i source mentioned because it's not digital. Kakomu (talk) 19:20, 19 December 2007 (UTC)
- I wasn't saying that interlace makes it not HD, but that with interlace you need more lines to get the same picture quality. For example, 480p plasma displays can look far better than 480i tube sets. Digital is not necessary for HD: Japan had an analog 1035i HDTV system from the 1990s that yielded excellent results and failed mainly because it was too expensive. Algr (talk) 20:04, 21 February 2008 (UTC)
"with interlace you need more lines to get the same picture quality"? please clarify/support this statement. I thought the number of lines was the same in 1080i & 1080p, & that the difference was largely in how they deal with motion artefacts.... so material shot as 1080i/50 (or 60, or 59.97..) native & viewed on a 1080i display is not going to look significantly different from material shot as 1080p25 (or 30 or 29.94) viewed on a 1080p display. it's only when dealing with format conversion (telecine, up-conversion of legacy material &c) that the problems start.... the real issues that force a choice one way or the other are to do with availability of this legacy material in suitable forms, the market penetration of suitable displays, & the bandwidth required to deliver a meaningful (but necessarily quite compressed) version of the chosen format.
The EBU is recommending 720p to European broadcasters in place of 1080i because (subjectively) it offers similar picture quality for a 20% saving in bandwidth (storage & delivery). But those of us wanting to up-convert old interlaced material are favouring 1080i because of motion artefacts. I'm quite happy to be proven wrong - but only if the proof contains real science & not dick-waving opinion. :-) Duncanrmi (talk) 20:51, 30 November 2008 (UTC)
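For a sense of scale, the raw active-pixel rates compare like this (illustrative only; the EBU's ~20% figure refers to coded bitrate after compression, which is a separate question):

 def pixel_rate(width, lines_per_pass, passes_per_second):
     # Active pixels delivered to the screen per second
     return width * lines_per_pass * passes_per_second

 p720p50  = pixel_rate(1280, 720, 50)   # 720p50: 50 full frames each second
 p1080i50 = pixel_rate(1920, 540, 50)   # 1080i50: 50 fields each second, 540 lines per field
 print(p720p50, p1080i50, round(p720p50 / p1080i50, 2))
 # 46080000 51840000 0.89 -> 720p50 carries ~11% fewer raw pixels per second than 1080i50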
- The French 819-line system would indeed have been, by today's standards, a high-definition TV system. Since the source material was interlaced, it would have been slightly better than 720p. Or it would have been, had it not been for a number of problems. 819-line required a massive 14 MHz of channel bandwidth (compared with the 3 MHz that the 405-line system required). The technology of the day wasn't really up to providing this - at least not over any real distance. The second problem was that the technology of camera tubes and CRTs at that time was just not up to the job (but it would have been just a few years after 819 was abandoned). In practice, many implementors of the 819-line technology limited the bandwidth to 7 MHz (as the newly introduced 625-line system required about 6.5 MHz) without any really significant impact. 86.148.141.199 (talk) 16:55, 20 June 2009 (UTC)
Wording Issue
"In 1969, NHK of Japan first developed commercial, high-definition television.[4] However, the system was not commercialized until late in the 1990s."
This is a completely paradoxical statement: if it was the first commercial HDTV, then it was, by definition, commercialized.
A more appropriate phrasing would be "In 1969, NHK of Japan developed the first commercially-viable, HDTV.[4] However, these designs were not fully utilised commercially until late in the 1990s"
But plainly it was NOT commercially viable. I don't think it was the "first" anything. "In 1969, NHK of Japan developed & demonstrated a high-definition television system comparable with modern HDTV. However, commercial use was not practicable until the late 1990s." Better? Duncanrmi (talk) 21:01, 30 November 2008 (UTC)
"TV of the Future Today?"
There is a lot in this article that reads like a brochure for some company ad. We need to get rid of that, hence I tagged this for NPOV and cleanup. I removed some of the more offensive stuff down in the "equipment" section. -Signed by Scryer_360, lazier than a Cadillac XLR in the turns. —Preceding unsigned comment added by 153.91.137.171 (talk) 06:19, 25 January 2008 (UTC)
In fact, I just went ahead and deleted the whole "HDTV equipment" section. It's a good idea to mention that people will need more than just the TV and a standard sat-box to get HD, but I'll be damned if I let that corporate-sounding trash reek all over Wikipedia. I mean hell, it was talking about the "HD Experience" and how "buying your new HDTV can be a great experience." Firstly, Wiki articles never address the reader as "you." They never address the reader at all; this is a catalog of knowledge, not a self-help book. Secondly, only corporate whores at BestBuy talk about HD Experiences. What the hell is that? I know it's going to be a way to experience something; hell, that's the baggage of doing anything, you experience it. I do not need more badly written brochure material to tell me that.
Signed by an increasingly angry Scryer_360, who doesn't understand why corporations want to help Americans get dumber by writing sections like the old HDTV Equipment section. —Preceding unsigned comment added by 153.91.137.171 (talk) 06:24, 25 January 2008 (UTC)
Suggest this site with in-depth technical format info
Suggest adding this link
As it provides much additional information not in the article (but too detailed for the article).
61.19.65.197 (talk) 15:44, 31 March 2008 (UTC)
Context, context, context!
I think the problem with the HDTV piece is that it doesn't provide enough context. If you only talk about the nightmare of conflicting standards, including offsquare pixels and seemingly insane frame rates such as 29.97, no one can make any sense out of it. The reader just starts thinking: What *is* all this?
Some of the reasons are rooted in technology, and some in politics. I realize this quickly becomes controversial, but I see no escape from it.
Technology: Video, until recently, was a broadcast medium, meaning, "over the airwaves." This imposed very significant technical compromises, such as the transmission of interlaced pictures. Cathode-ray tubes were of such low definition, no one could notice motion artifacts caused by interlacing anyway--but when you see interlaced video on a sharp computer monitor, the "combing" effect is extremely noticeable, especially if you capture one frame. Why is it still there? Because bad habits are hard to get rid of. When broadcasters were confronted with the prospect of HDTV, they fought very hard against progressive scan, because it would have consumed much more bandwidth. The advent of compression technology had created an exciting opportunity for them to actually squeeze more channels in the old bandwidth, if picture quality was not improved; and of course, more channels equals more revenue. This is the real reason why interlacing is still with us: A simple desire of broadcasters to maximize revenue. Most of them refused to believe that viewers wouldn't notice the difference, and who knows, maybe they were right.
I am not speaking anecdotally. Years ago I wrote a very long piece for Wired magazine ("The Great HDTV Swindle," not my title) for which I did interviews with TV station owners. Their contempt for their audience was rather surprising. If they had been in the business of making computer systems, we'd still be stuck with 640 x 480 video monitors displaying 256 colors! They absolutely refused to believe that a significant number of viewers would pay more money for a picture that was better than plain old NTSC. And even now you see the broadcaster mentality affecting image quality. Just look at the compression artifacts in any regular picture from satellite or cable. A good LCD monitor (not high def) shows color banding in shadow areas that is appalling. But no one cares, because it's "only television." HBO, incidentally, is a prime offender.
I realize Wikipedia needs to come up with a more diplomatic way to express these concepts. That's why I am posting this to a discussion, not the actual page.
Politically, we have two problems: In the United States, the FCC chose not to establish a standard for HDTV, so we ended up with more than ten. And in other nations, establishing standards seems to have been, in part, a reflection of the desire to have a national identity. Color PAL came after color NTSC, which allowed an opportunity to improve picture quality; of course the Europeans could have attempted to make it compatible with US standards, at least to some extent, but (and I speak as a former European) no one liked the idea of slavishly following the Americans. And then the Japanese came up with their own idiosyncratic HDTV standard (completely impractical for over-the-air broadcasting by multiple competing stations in the US) at a time when they were the leaders in electronics, and wanted to demonstrate this. And so on.
I really think some of this background needs to be established in the Wikipedia entry, to explain why video is such an utter mess. The lack of guidance from the FCC regarding HDTV is particularly important; I interviewed the chairman of the FCC during this period (Reed Hundt) and he was opposed to intervening in any meaningful way. The FCC has no problem imposing standards of decency, but when it came to establishing a new standard that the broadcasters might not like (a function which one would assume is appropriate to a regulatory agency) they chose to do nothing, clearly for political reasons. Government has a very incestuous relationship with television in the USA: Legislators depend very heavily on media exposure, to get elected. No legislator wants to alienate broadcasters.
Along the way it might help, also, to mention why the NTSC digital video standard uses rectangular pixels, presumably because an old analog NTSC picture with a 4:3 aspect ratio could contain just enough information to justify 720 pixels per line, but not enough information to justify more than 480 lines per frame.
And, let's not forget overscan! Old cathode ray tubes could not maintain picture quality at the beginning and end of each (analog) scan line, so those parts were just thrown away. Even now, when you do video editing on high-end software such as Adobe Premiere Pro, the software includes a feature to warn you if titles (in particular) stray outside of the "safe zone" in the central area of the screen. Video has an entrenched history of being such a poor-quality medium, a director couldn't even be sure of how much of his picture would be displayed on a typical TV set. These concepts are very alien to anyone who deals in digital still photography, where you can crop an image to the nearest single row of pixels.
Some statements contrasting the factors affecting video with the factors affecting computer graphics would be helpful. Digital cameras shoot millions of pixels, in millions of colors, using relatively high quality compression that doesn't create visible artifacts. Digital camcorders capture maybe 1 megapixel (one-third of a megapixel in the old 4:3 aspect ratio) and inflict compression which is quite noticeable--unless they save onto tape, which is still the highest quality storage medium, incredibly. Why? Because video still pushes the limits of digital storage media. I can store 500 five-megapixel still photos on a 2GB flash card, with unnoticeable compression. At approximately 30 frames per second, those same 500 pictures would yield less than 20 seconds of video time. This is why video is still of such poor quality compared with digital still images. Bandwidth is the overriding factor, requiring significant compromises.
Incidentally the instruction manual that comes with Adobe Premiere Pro contains *the* best overview of incompatible video/film standards, and the horrible methods that are used to port a video stream from one to the other, including dropped frames, doubled frames, resampled frames, and on and on. Read it and weep.
Charlesplatt (talk) 07:44, 11 April 2008 (UTC)Charlesplatt
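The storage arithmetic a few paragraphs above is easy to reproduce (rough figures only, using the numbers given there):

 card_bytes = 2e9                           # a "2 GB" flash card
 photos = 500
 bytes_per_photo = card_bytes / photos      # ~4 MB per five-megapixel JPEG
 fps = 30                                   # approximate video frame rate

 print(round(photos / fps, 1), "seconds of video")        # ~16.7 s, i.e. "less than 20 seconds"
 print(round(8 * bytes_per_photo * fps / 1e6), "Mbit/s")  # ~960 Mbit/s to sustain photo-grade frames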
- I certainly agree that the article should be re-formed, a major task; and there is not nearly enough talk about the political posturing around the standardization that occurred - at least in the US. Two possible corrections: My understanding is that it's Comcast, or pick-your-least-favorite-cable-company, who compresses the hell out of the signal and causes the awful artifacty display. Also, Sony and Microsoft do have standards about "safe zone" display even on HDTV sets. I think the worry is about projection TVs rather than LCD sets. Tempshill (talk) 05:23, 19 November 2008 (UTC)
480p HD or not
There is little mention of 480p here. I'd like to point out that the movie industry seems to think that 480p video is 'HD', even though this article claims only video with 720 lines (or more) is high-definition. You just need to go to any movie company's web site to see large lists of 480p video. OK, these are most likely viewed on a computer monitor and are rendered at 480p to enable playback on lower-performance computers, but quite a few are also viewed on an HD-Ready television via an Internet-accessible TV set.
So should these 480p videos be classed as HD or not?
--Quatermass (talk) 10:18, 9 June 2008 (UTC)
- I've been saying no, but we may have to acknowledge that most sites do seem to have standardized on this: [2] Algr (talk) 16:15, 9 June 2008 (UTC)
- 480p is not HD. 480p (480p60, 480p30, 480p24) is defined in the US ATSC standards as ED (Enhanced Definition). (Folks who mistakenly connect a converter box to an HDTV set can watch all broadcasts as ED (4:3).) 480i30 (SD) is de-interlaced by flat-screen HDTV sets to 480p30 before being displayed. EDTVs (Enhanced Definition TVs) display 480p fine, but they cannot display HD (720p, 1080i, 1080p). -Dawn McGatney 69.139.231.9 (talk) 07:06, 5 August 2008 (UTC) —Preceding unsigned comment added by 98.233.69.25 (talk)
Where is all the information in this article talking about ?
(The US, I guess, but who knows?)
Much of the information in this article does not state which country is being described. Since no country has a monopoly on HD television, the article ought to make clear to which country statements refer. It makes no sense to make statements about cable as if the information applied to all cable everywhere. —Preceding unsigned comment added by 86.150.205.135 (talk) 21:40, 10 June 2008 (UTC)
HDTV 25th Anniversary of the introduction of HDTV in Europe (1982 - 2007)
High definition television comes of age thanks to ITU
Honorary plaque presented by BBC, CBS, NHK, EBU, IAB, DTV, Rede Globo and Sony to the Radiocommunication Sector of the International Telecommunication Union, ITU-R (formerly CCIR), on the occasion of the 25th Anniversary of the introduction of High-Definition Television in Europe:
High Definition Television (HDTV) is the most powerful and influential media in human experience. HDTV brings sharper pictures to television audiences, and provides a stunning viewing experience on today's large-screen HDTV television sets. It will shape the way we experience and see the world. HDTV is now replacing conventional television as all media channels make the transition to HDTV. Eventually all the world's television viewers, currently 4.5 billion, will be watching HDTV, thanks to the work of the ITU.
The idea of HDTV came from Japan, and from the fundamental work of the research laboratories of Japan's national broadcaster, NHK. 2006 and 2007 marked the 25th anniversaries of the first demonstrations of HDTV, respectively in North America and in Europe.
In 1972, the former CCIR Study Group 11 began to focus its activities on high-definition television, recommending some of the characteristics of the first analogue systems, which, incidentally, produced baseband signals that could not fit in any of the broadcasting bands available at that time. Only after the introduction of digital techniques and over twenty years of continuous studies was it possible, in 1998, to unanimously approve Recommendation ITU-R BT.709 in its current version. It represents what is recognized today as an outstanding achievement of the ITU: the specifications for a single worldwide standard for HDTV production and program exchange, based on an image sampling structure of 1920x1080 pixels in what has become known as the Common Image Format (CIF).
The approval of Recommendation ITU-R BT.709 triggered a high level of activity on the part of manufacturers and broadcasters to develop the tools and the know-how to implement extensive HDTV program production. Since then Recommendation ITU-R BT.709 has remained unchanged, thus allowing manufacturers to produce equipment at an ever-lower cost. Not surprisingly such equipment today costs even less than any other comparable equipment for television systems of inferior quality, thus demonstrating the benefits offered by international broadcasting standards, and the success of the ITU in its role as an international standard-setting body.
Dr Timofeev, Director of the ITU-R, states "The world owes the pioneers and all those who worked in the ITU on HDTV a great debt of gratitude. We can only hope to emulate their success in the coming age of Super High Definition Television, and for future technologies of the media".
For more information, please see: http://www.itu.int/ITU-R/HDTV/ —Preceding unsigned comment added by 156.106.202.4 (talk) 14:55, 4 July 2008 (UTC)
Physics of HDTV
Perhaps someone with the appropriate knowledge could discuss the physics behind the development of high definition television. FitzColinGerald (talk) 19:07, 31 July 2008 (UTC)
Lag?
Why is there nothing about the lag they get? It's a known fact that they have a lag from the video coming in to actually showing it. 98.15.216.208 (talk) 00:42, 7 August 2008 (UTC)
- I second this request. Urban myth or real? How much lag and why? The Guitar Hero video games, in particular, have user interface screens to compensate for the supposed lag, so it's known now to millions. Tempshill (talk) 03:49, 19 November 2008 (UTC)
Grasp reality
What the fxxk is High-definition? http://wiki.riteme.site/wiki/High-definition_television 88.115.2.114 (talk) 15:32, 20 December 2008 (UTC)
- I think the definition is quite clear enough for the great majority of readers. Binksternet (talk) 23:11, 20 December 2008 (UTC)
The lead section of this article has an important omission! Although high-definition television (HDTV) refers to the digital television broadcasting system, it is also used to refer to the display device that allows for the playback of the broadcast. The lead section should address the fact that there are two interchangeable meanings of the term “high-definition television” or “HDTV”. Within the body of the article, the term “high-definition television” is used to refer to the display device (see the HDTV sources section); it is also used to refer to the display device in some of the cited references for this article,[1][2] and in a U.S. government article.[3]
Here is a possible new first sentence(s) for the lead section.
High-definition television (or HDTV) refers to both the digital television broadcasting system and the accompanying display device, which facilitate, respectively, the transmission and reception of higher-resolution images. HDTV delivers higher-resolution images than the traditional television systems it is designed to replace, thus providing the end user with “more compelling audiovisual experiences”.[4]
The lead section should probably introduce the 16:9 screen ratio, touch on the fact that the HDTV format is a subset of DTV, outline the various resolutions in use (with wiki links to those articles), etc., so that the lead section stands on its own as a summary of the article.
- ^ http://www.tech-notes.tv/Archive/tech_notes_002.htm
- ^ http://www.answers.com/topic/hdtv-display-modes?cat=technology
- ^ http://www.dtv.gov/DTVOne-Pager.pdf
- ^ Research on Human Factors in Ultrahigh-Definition Television(UHDTV) to Determine Its Specifications By Masayuki Sugawara, Kenichiro Masaoka, Masaki Emoto, Yasutaka Matsuo, and Yuji Nojiri
Audley Lloyd (talk) 18:53, 22 April 2009 (UTC)
- Since when was HDTV limited to digital transmission systems? Every HDTV that I am aware of will accept an analogue 1125-line (interlaced) signal, and some will accept an analogue 1125-line (progressive) one. The 720p standard is supported as well. They will, however, only accept them via the Component video (YCbCr) input. Every HDTV source that I possess will provide such output. 86.148.141.199 (talk) 16:44, 20 June 2009 (UTC)
Incorrect Info Regarding Cinema 4k
In the "High-Definition Display Resolutions" section, it sataes, "Currently, there is no HD Ready 2160p Quad HDTV format until 2015". This statement is untrue. I attended the I/ITSEC conference last year in December (a major annual modeling and simulation conference in Orlando). I visited the Sony booth where they were showcasing this very format on a large display. Perhaps it should be modified to say, "Currently, there is no STANDARDIZED HD Ready 2160p Quad HDTV format until 2015". Also, I'm not sure where the data 2015 comes from.
I'm not really sure how I can put a citation for something I physically saw. The conference papers can be found here: Conference Proceedings [3]
Mr3dPHD (talk) 14:57, 20 January 2009 (UTC)
Color
The section on color is odd. What did this Mexican inventor have to do with high definition color television? Putting aside that question, his first demonstration of electro-mechanical color television (his patent used a spinning color filter wheel) in 1946 came years after Baird, CBS, and RCA had demonstrated color electro-mechanical transmissions. Baird had even demonstrated an all electronic receiver in 1944. — Walloon (talk) 17:47, 7 May 2009 (UTC)
Indefinite article
The choice between "a HDTV" and "an HDTV" depends on the pronunciation of "H". However, as the H article suggests, the standard pronunciation is "aitch" (and "haitch" is non-standard). Therefore, "an HDTV" is correct. Oli Filth(talk|contribs) 12:01, 29 May 2009 (UTC)
- I agree. "A HDTV" is awkward to say, and I've never heard it that way. "An HDTV" is correct. Algr (talk) 20:13, 29 May 2009 (UTC)
- I agree too. I've reverted it again. --Harumphy (talk) 21:03, 29 May 2009 (UTC)
- Agree as much as you like. 'An' never precedes any word other than those starting with a vowel (that's 'A', 'E', 'I', 'O' and 'U'). 'H' is not a vowel. It doesn't matter how awkward it is to say; English is not a language that tries to make pronunciation flow like French does. 86.145.21.227 (talk) 19:13, 31 May 2009 (UTC)
- Do you have any evidence that supports your theory that one shouldn't use "an" before "H" when pronounced as a letter, or for that matter before words such as "hour" or "yttrium"?
- Please note (before you make the same edit yet again) that Wikipedia works on a system of "consensus" (see WP:Consensus). In essence, it means that you shouldn't repeatedly and blindly make the same edits over and over again if multiple editors disagree with the change. Oli Filth(talk|contribs) 19:19, 31 May 2009 (UTC)
'Hour' would be "an" because it has a soft H; however, HDTV has a hard H, so it should be "a". --Cameron Scott (talk) 19:24, 31 May 2009 (UTC)
- As above, according to the H article, the standard pronunciation of "H" is "aitch", i.e. starts with a vowel sound. If you pronounce it "haitch", that's up to you, but I think we should stick with the standard when it comes to grammar! Oli Filth(talk|contribs) 19:26, 31 May 2009 (UTC)
- Wikipedia does not work by consensus. It works by citation. Also, it claims to be an encyclopedia, so it should be correct. Wikipedia is, of course, neither, having fallen into disrepute a long time ago. Googling 'An versus a' throws up lots of authoritative citations (and unfortunately a number of amateur sites that don't agree on anything much).
- The softness or hardness of the 'h' has nothing to do with it. 'H' is not a vowel - check an authoritative dictionary. The practice doesn't even originate with the pronunciation being derived from the pronunciation of French words that have a silent 'h', the most notable example being 'hôtel'.
- From Collins English Dictionary page 1:
- a, an: the indefinite article meaning one. an is used before words starting with vowels
- From Oxford dictionary of English usage also page 1:
- An is used instead a before nouns and adjectives beginning with a vowel. Placing an before an 'h' where the 'h' is silent is a common error. 86.145.21.227 (talk) 19:47, 31 May 2009 (UTC)
- Wikipedia works with citations for facts that are inserted into article content, not when it comes to, for instance, article structure or house style. Are you telling me that you would say "a MP" or "a XML document"? I think you would be in a very tiny minority!
- Maybe, but I would at least be right. 86.148.141.199 (talk) 16:39, 20 June 2009 (UTC)
- Oh, and have you spotted the irony you've managed to build into your second quote? Oli Filth(talk|contribs) 22:13, 31 May 2009 (UTC)
- Actually, I didn't build it in as I scanned it straight out of the book. And you are correct, they have got it wrong (even by their own rules). They also seem to have missed an 'of' as the fifth word. Hmmm 86.148.141.199 (talk) 16:39, 20 June 2009 (UTC)
Well actually it is pronounced "Haitch Dee" as you would have it - just because your colloquial dialect means you are relaxed in your sentence structure doesn't mean that you are correct - you should NEVER pronounce a H as "aitch"; it is a "lazy" form of speech. In proper English a H is "haitch" or is silent, as in "hour", but never "aitch" - why don't you go and ask any year 1 or 2 pupils - it seems they would have a better understanding of how to pronounce basic letters of the alphabet. Also, if you are going to start saying that this is the way the majority of people say it, try backing that up with some evidence - I don't know, maybe a survey of all English-speaking persons who use the site - other than the logic of "that is how I say it so it must be how everyone else says it". Just because it is how you say it doesn't make it correct - END OF 86.129.34.17 (talk) 09:15, 1 June 2009 (UTC)
- Where I come from (England) educated people never use 'haitch'. That pronunciation is the exclusive preserve of the ignorant, along with 'skellington' and 'cerstificate'. BTW, before you post another comment on my talk page criticising my grammar, please learn the difference between your and you're. --Harumphy (talk) 09:33, 1 June 2009 (UTC)
- Again, do you have any evidence? I have the H article here, and the OED: [8]. Oli Filth(talk|contribs) 10:08, 1 June 2009 (UTC)
- Well my IP is that of the BT network in England - guess where that makes me from - yes well done England! If you could be bothered to do an IP Geo search you would find that I am currently studying at Cambridge University, which I guess by your elitist standards would make me well educated - also if you could highlight the section where I have misused "you're" and "your" I would greatly appreciate it, as I cannot seem to see where I have misused either. Also YOUR talk page??? Sorry who are you exactly??86.129.34.17 (talk) 13:57, 1 June 2009 (UTC)
- Sorry. I didn't look very carefully at the IP address and mistook you for 86.145.21.227, who posted some half-baked wibble on my talk page. I'm at a loss to know why I was expected to deduce that you are studying at Cambridge from your BT IP address. (I've no idea what an IP Geo search is.) I know lots of people with BT IPs and I hadn't noticed any correlation between their ISP and their alma mater. While you are studying at Cambridge, you might like to observe that most people there pronounce H as aitch not haitch. --Harumphy (talk) 16:07, 1 June 2009 (UTC)
- Pretty sure these are both edits from the same person. Oli Filth(talk|contribs) 23:48, 1 June 2009 (UTC)
- Very unlikely. Here in the UK we have dynamic IP addressing. Every time you connect to the internet you are allocated the first unused IP address. Today I am 86.148.141.199 (talk) 16:31, 20 June 2009 (UTC)
- Two IPs from the same ISP making the same obscure edits within 24 hours of each other? Sounds fairly likely to me! Oli Filth(talk|contribs) 00:21, 21 June 2009 (UTC)
- Hi Anon! If you want to be recognized and have ongoing conversations, it is a good idea to get yourself a proper wiki name. Otherwise, given that your IP keeps changing, there is nothing that tells us that more than one person is talking. As for the pronunciation, you need to provide us with references that "haitch" is the preferred pronunciation of "H". So far all the documentation seems to point to the contrary. Wikipedia is based on references. Consensus is used for issues such as which references are valid or appropriate for a given subject, or judging the quality of writing. Algr (talk) 04:00, 22 June 2009 (UTC)
Lost in the technical details; virtually useless for the nonexpert
I think the article is virtually useless for someone who isn't already well-versed in the technicalities of HDTV. There needs to be more of a high-level overview section to the article, before diving into all the details. It seems I'll have to look elsewhere to actually find out what HDTV is all about, as a generally interested consumer. (And I'm a scientist, so I have no problem with technical details, when I want them. But in most things I'm necessarily only capable of a high-level, broad-brush understanding, and this article doesn't provide it for me.)
Disadvantages of HDTV?
The "Disadvantages of HDTV" section talks about cable providers not providing the correct HDTV format for subscribers. It also goes on to mention the confusion of consumers in differentiating the various video formats. This is not a disadvantage of HDTV specifically; it can only be attributed to the lag time it takes to become familiar with a new technology. Also, cable providers who do not provide the original format cannot be considered responsible for a disadvantage of the HDTV technology. I hope someone will take the initiative to correct this misleading topic. Jyrejoice (talk) 14:04, 2 July 2009 (UTC)
- Consumer confusion stems from the introduction of HDTV; it can be discussed. If people are experiencing a lag time in getting familiar, then that can be discussed. If cable providers are broadcasting images that end up being shown in the wrong height:width ratio then that can be discussed. HDTV technology is not an ivory-tower concept; it's a real-world implementation. Binksternet (talk) 14:41, 2 July 2009 (UTC)