
Intel Gives Up On TV

symbolset writes "Bloomberg is reporting that Intel, on the cusp of having low-power embedded chips that can do true HD in a flatscreen, has given up on getting its chips embedded in TVs. While many might say its efforts to date have been fruitless because of energy issues, Medfield might have had a chance in this field."

Comments Filter:
  • Translation, please? (Score:4, Interesting)

    by afabbro ( 33948 ) on Wednesday October 12, 2011 @01:33AM (#37687362) Homepage

    Intel has been unable to provide a chip that offered significantly different performance from rival offerings, and failed to convince TV makers such as Samsung Electronics Co. or Sony Corp. that they needed its chips, Acree said.

    OK, geeky people, what does that mean?

    I interpret it as "producing chips for TVs is a commodity business and there's little opportunity to introduce anything new." Was Intel just late to the TV chip party and other chipmakers had it sewn up?

    I would think even as a commodity producer, Intel would be competitive just because they have huge scale.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      I took it to mean Intel wants to cram x86 suckage into everything, to leverage its efforts at making a low-power x86 for UMPCs.

      But nobody wants to run extant x86 apps on their TV, so everyone is happy with their ARMs (mostly), MIPS, etc. If Intel were willing to depend on someone else's IP, they could get back into the ARM business and just clean house (as you said, huge scale, and usually a process shrink or so ahead of their competitors), but x86 just isn't cutting it.

      • by Amouth ( 879122 )

        I never did understand why they got out of the ARM business... their XScale CPUs were better than the competition by a long shot. Sure, they were more expensive, but it was worth it if you actually had to use the device. And if someone did manage to challenge them, they had the scale/volume to drop prices and clean house.

      • But nobody wants to run extant x86 apps on their TV

        Not even games? Imagine a TV with a built-in PC that can connect to Steam, Impulse, and GOG to download games. Put your wireless mouse and keyboard on a TV tray, and PC gaming is back. Apple already makes a 21" and 27" model.

        • I can already do all that stuff without having the TV and PC share a case. My computers have HDMI out and my TV has VGA in.

          • My computers have HDMI out and my TV has VGA in.

            Some people still have a CRT SDTV with only a composite input, or an early 1080i CRT HDTV with only composite and component inputs. And even if the video signals are compatible (VGA out to VGA in, or HDMI or DVI out to HDMI in), not everybody has a computer in the same room as a TV. They might own only one computer and not want to have to carry it back and forth between the computer desk and the TV. It appears that in practice, statistically nobody is interested in buying a computer to hook up to the TV. (See pr

            • Good footnoting, there, tepples!

              This is probably a function of time of purchase. I have not seen anyone buy a laptop that didn't have a TV-out of some sort (composite, DVI, HDMI or S-video) for years, but it seems like VGA inputs are just starting to become standard on HDTVs recently, and HDMI-out on video cards still isn't really widespread (although obviously it is already commonplace on gamer video cards). I just happen to have bought my first HDTV last solstice; early adopters are probably more limite

              • by tepples ( 727027 )

                HDMI-out on video cards still isn't really widespread

                DVI-D output has long been standard on even low-end video cards, even if not on desktop integrated graphics. I've seen several video cards with no VGA connector, just a DVI-A to VGA adapter hanging off a DVI-I port. My TV has an analog audio input next to one of its HDMI inputs, which appears to have been designed specifically for use with a DVI to HDMI cable and an analog audio cable.

                But just because the port is there doesn't mean that TV owners A. know it's there or B. feel like using it. Is there anyt

                • That's a pretty good start on a howto, there (as somebody commented, you ought to find a Brit to add SCART).

                  My kids and octogenarian grandfather would have no problem following it. The pictures of connectors and corresponding tables work really well.

                  But my mother would never find it useful, because she doesn't want to know how to hook up electronics. She doesn't need to change her attitude, either - she's just fine the way things are! Similarly, I don't want to know how to deconstruct poetry or mine gyps

                  • by tepples ( 727027 )

                    So I guess I'm saying "don't worry about it, you've already led the horses to the water. They'll drink when they're ready."

                    There are some video game genres that don't work well on the monitor connected to the average PC, but they work well on a larger, TV-size monitor. Take split- or otherwise shared-screen co-op games or party games in the vein of Bomberman or Smash Bros. It's kind of hard to fit two to four people holding gamepads around a 17" to 19" PC monitor. Such games have historically been released for consoles, but indie developers tend to be unable to afford the organizational overhead of console game development.

    • by Anonymous Coward

      Samsung produces ARM chips, which have smaller instruction sets than x86 chips and can run more efficiently in less demanding applications such as TVs. While Intel has made great strides in producing smaller and more efficient x86 chips, they are still just too bloated and power hungry.

      Car analogy: Sticking a V8 in a golf cart. While the idea sounds appealing, it is not efficient for the designed purpose of the golf cart.

      • I think the problem might be that they found out that TVs don't really eat chips, at least not at the same rate as the viewers do.

        And your car analogy makes no sense at all. Why would my golf cart need a JavaScript interpreter?

      • V8 Lawnmower [youtube.com]

        :D

        Turns out there are quite a few of these - Google is your friend. Not to mention Tim Allen's Turbo Mower in Home Improvement - The Great Lawn Mower Race [youtube.com] against Bob Vila.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Keep in mind that up until 2000-something, Intel was not only one of the largest chip manufacturers but also one of the largest manufacturers of embedded controllers in the world. Some MBA dickhead under Otellini or his predecessor (the dude who fucked them over with Rambus for the first year of the P4's life) decided that embedded wasn't a high enough 'profit' division to hold onto and either sold it off or spun it down. Point? Intel already dominated that market many years ago, but due to trends in management are too

      • by dbc ( 135354 ) on Wednesday October 12, 2011 @02:21AM (#37687548)

        Dickhead == Craig Barrett. He undid 25 years of Intel culture in less than a year. It took Otellini (for whom I have the greatest respect) almost two years to correct the Barrett fallout. But in the end, Intel pretty much makes decisions based on gross margin per wafer. They'll do strategic things for a while, but if the margin per wafer doesn't show up pretty soon, they kill the experiment. (Speaking of strategic, here's a fun game: The next time a salesman (or marketroid) tries to convince you to do some deal because "it's strategic", respond with "Oh, you mean it's no revenue." Enjoy deer-in-headlights face.)

        • by Kjella ( 173770 )

          They'll do strategic things for a while, but if the margin per wafer doesn't show up pretty soon, they kill the experiment. (Speaking of strategic, here's a fun game: The next time a salesman (or marketroid) tries to convince you to do some deal because "it's strategic", respond with "Oh, you mean it's no revenue." Enjoy deer-in-headlights face.)

          Isn't that pretty much the definition of strategic? We're not great at making these kinds of products today; we lack the customer base, the experience and the reputation. So we do projects at break-even, or possibly even at a slight loss, in order to break into the market, because it's our strategy to become an established player. If a) we aren't able to establish ourselves, or b) we do and there are still no profits, then we don't keep doing what doesn't work.

          • Actually, no. That is not "pretty much the definition of strategic".

            I don't like Microsoft, but they play an awesome strategic game. They invest in stuff, they buy stuff, they develop stuff - oftentimes, stuff that really has no future. But do they ever get rid of any of that stuff? Not only "No", but "HELL NO!" Microsoft may put things on a shelf and halt development if something loses too much money - but they aren't about to get rid of anything. They can afford warehouses, terabyte on terabyte of hard dr

      • by Anonymous Coward on Wednesday October 12, 2011 @02:51AM (#37687660)

        Keep in mind that up until 2000-something, Intel was not only one of the largest chip manufacturers but also one of the largest manufacturers of embedded controllers in the world. Some MBA dickhead under Otellini or his predecessor (the dude who fucked them over with Rambus for the first year of the P4's life) decided that embedded wasn't a high enough 'profit' division to hold onto and either sold it off or spun it down.

        I worked for Intel during that period. Management was totally poisoned by the dot com disease. You could have a business plan that called for spending $50 million over five years to create a guaranteed $150 million a year product line with 25-40% margins and they didn't want to know. They were only interested in stuff that supposedly was going to produce a $500 million business a year in 18 months. They spent vast sums of money on second string chip companies, some of whom were already in trouble before the bottom fell out.

        The other thing: they have this focus on margins that is deranged, in that it's a straight percentage target that isn't adjusted for the market. In some markets, yes, 60% is needed because you need to reinvest constantly in new designs. But there are other markets where 20% is more typical - markets where the product life cycle is 36 months, not 12. So they give up on stuff when the margins aren't there, forgetting that the proper metric is return on investment. Case in point: say you have a business that makes $150 million in revenue at a 20% margin, which is $30 million. If you have to invest $10 million a year in product design and whatnot to keep that business, then you make $20 million a year off a $10 million investment - a 200% return. But the way Intel sees it, it's only 20%, not enough to waste time on.
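
        To make that arithmetic concrete, here is a minimal sketch in Python using the hypothetical figures above (they are this comment's illustrative numbers, not real Intel financials):

        ```python
        # Illustrative arithmetic only: the figures are the hypothetical ones from
        # the comment above, not real Intel financials.
        revenue = 150_000_000            # annual revenue of the product line
        gross_margin_rate = 0.20         # the "only 20%" margin
        annual_investment = 10_000_000   # yearly design spend to keep the line alive

        gross_margin = revenue * gross_margin_rate      # $30 million
        net_profit = gross_margin - annual_investment   # $20 million
        roi = net_profit / annual_investment            # 2.0, i.e. 200%

        print(f"Margin on revenue: {gross_margin_rate:.0%}")  # 20% -- looks unimpressive
        print(f"Return on invested R&D: {roi:.0%}")           # 200% -- the metric that matters
        ```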

        Then the dot bomb happened and they tossed overboard everything that wasn't going to turn a profit in 12 months. They also stopped development on product lines, thus killing them over the medium term. Of the dozen or so companies they bought between 1998 and 2001, they closed all but one, and that one only because being 12 and 0 would have been too embarrassing.

        My impression is that Intel has a lot of capable people, and the money to hire more of them. But the upper management has issues. When they enter a business and find that the other players are determined and competitive, and the customers are used to wheeling and dealing to get the best value out of their vendors, management gets pissy and shuts everything down instead of sticking it out long enough to carve out some market share.

        • That was well said! However, I don't see this as an Intel-only issue. Most of the companies I have worked for over the last 10 years seem to have shifted to the same mindset: only short-term returns and cutting heads to show a profit, all so management can make their bonuses. At some point it has to fail; I just hope that when it does, the board sees it for what it is and makes a better choice when replacing them.

        • by swalve ( 1980968 )
          That's an MBA disease. They got the important message: sometimes it's better to think in percentages. They missed the less important message: 1% of $1 million is still more than 50% of $1,000.
      • by tlhIngan ( 30335 )

        This is the same thing that happened to the i740 successors and Larrabee (both of which sucked engineering-wise, but whose basic premise should've been kept): Intel could've been in the video card market where nVidia/ATI are now. But some shortsighted manager decided it made more sense to cut their losses than to persevere and get the product right so that the next generation, and the one after that, would succeed.

        True, but then again, considering Intel is one of the top video card makers out there, does it really m

    • Intel has been on the cusp of producing the ultimate low-power chip that works everywhere from cell phones to microwaves to toaster ovens to tablets to watches [just about a year from now] for the past 10 years.

      They ain't there. They ain't about to get there. And somebody's already there and has been doing it for a long time.

      Bullshit finally walks.

    • I was there. Intel doesn't "get" the TV biz. They're not gonna. This announcement is their admission that they don't get it.

      This is sad because they were almost there. Their prior efforts were sad, but they had some prime shizzle in the pipe that had a legitimate chance.

      • I can agree to this (and was there too - as a former member of the Digital Home Group) - The Canmore [engadget.com] project for instance had some serious potential, but the thing was somewhat hobbled from the start (NDA prevents opinions as to exactly why, but let's just say that IMPO it could've done a lot more than it actually did).

        The biggest problem was that they interrupted everyone on the Oregon side with a physical move (From CO to JF), and after that began the whole 'let's be a part of Viiv!' bullshit (Viiv? Yeah,

    • by sjames ( 1099 )

      Not at all. There might be some market for a more capable chip at the same power consumption if the price is right, but Intel couldn't fulfill that demand. Their chips were either too expensive, not capable enough, or too power hungry for that application.

      Other potential disqualifying factors might include not being willing to guarantee a long enough product lifetime. Unless there have been improvements, Intel's relatively weak debug and test interfaces could also play a role.

      Last up, ARM and MIPS based chi

    • Was Intel just late to the TV chip party and other chipmakers had it sewn up?

      I blame Doritos.

    • by DrXym ( 126579 )

      OK, geeky people, what does that mean?

      It probably means these companies already have SoCs they use for this stuff (and may have a stake in), and see no reason to ditch what they have for something produced by Intel.

    • I wonder if this CPU from a huge consortium of very powerful Japanese companies is finally making some progress?
      Seven Japanese Companies to Develop Microprocessor to Compete Against AMD and Intel [xbitlabs.com]

    • by jon3k ( 691256 )
      If AppleTV and GoogleTV are any indication, saying that there isn't much opportunity to advance the television might be the understatement of this decade.
  • Intel chips. In TVs! (Score:4, Interesting)

    by adolf ( 21054 ) <flodadolf@gmail.com> on Wednesday October 12, 2011 @01:45AM (#37687398) Journal

    I remember a TV many years ago, perhaps in the late '90s or early 2000s, which booted with a common Award BIOS screen and RAM check. I think we sold exactly one (and that one was the display model).

    It was a useless device. Despite having a high-res CRT display with decent color, and a line doubler (which was potentially way cool in those pre-HDTV/DVI/HDMI times), it sucked: It irrevocably upscaled the output of a PSX, and the result was double-ugly instead of double-smooth since it got the field order precisely wrong.

    It had an Intel CPU.

    Is it dead now?

    Good.

    Thanks!

    [/shallow]

  • The Google TVs from Sony use Intel chips, according to their own marketing at least. Will Sony give up on Google TV or switch to ARM?

  • liquid crystal on silicon -- didn't they try this TV chip a while back too and fail?

    come to think of it, Apple and Google also tried TV and failed

    maybe computer companies should just stick to making computers and leave TV to Sony and Samsung
  • Good riddance, I'd say. I'm sick and tired of the 800 lb gorilla sticking its nose in everything that has more than a dozen transistors.

  • by Chrisq ( 894406 ) on Wednesday October 12, 2011 @03:00AM (#37687688)

    Intel Gives Up On TV

    I don't blame them ... there are very few good programs and too many reality TV shows

  • by JTL21 ( 190706 ) on Wednesday October 12, 2011 @03:51AM (#37687860) Homepage

    not for Google TV).

    Intel chips are expensive, and these days you would very much expect a highly integrated chip with demuxes and decoders for digital broadcasts, plus video and audio processing elements to improve the quality. There would typically be a whole bunch of dedicated functional units for most functions, all baked onto the silicon. The general-purpose processor would typically be fairly weak, but with a lot of support. Main processors may get somewhat more powerful to support browser-type technology, but I wouldn't expect them to reach Intel Atom speeds in most cases for some time. Which would you rather have, a TV with a fast web browser or good picture processing?

    The current Sony Google TVs (the integrated screens) still carry the same main chip as the rest of the Sony range, in addition to the Intel processor and graphics (see the rough sketch at the end of this comment). I'm not certain of the extent to which this is absolutely technically required, or whether it was needed to reuse the existing TV reception and processing software. This means that the cost to build a Google TV was like building a normal TV and adding a bare-bones Atom PC. As I understand it, the business case was built on an expectation of purely additional sales, marketing funds from Intel, and an expectation of smaller margins for retailers, although I think there were also some unreasonable assumptions, particularly if you had ever tried the product.
    http://techon.nikkeibp.co.jp/english/NEWS_EN/20101117/187451/ [nikkeibp.co.jp]
    http://www2.renesas.com/digital_av/en/mpegdec_tv/emma3tl2.html [renesas.com]

    If Intel does back away from the highly cost-sensitive TV chip business, I would expect Google to offer support for ARM. I think most of the TV manufacturers are on, or moving to, ARM, although MIPS is certainly used in current models. The newer high-performance ARM chips are probably significantly more expensive than typical TV processors, but they probably make more sense than Intel Atoms, with the ability to custom-specify the chip features and still be cheaper.

    Features on such chips will be specified by the major manufacturers, but the feature set will probably be locked down at least 18-24 months before the TV ships, ruling out some things after that date.

    The TV business is a hugely competitive market and there is no profit in it (possibly with the exception of companies that have their own panel manufacturing). The combination of falling prices, long parts lead times and the importance of volume in getting component prices makes it a very tricky business to make money in. But it is key to many companies' positions in the consumer electronics area and can bring leverage into other businesses (by securing retail space, offering full product suites and, increasingly, giving scale to over-the-top online video offerings).
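
    As a rough illustration of the hardware split described above, here is a minimal sketch of a typical integrated TV SoC versus the Sony Google TV approach of a normal TV chip plus a bare-bones Atom PC. The block names are illustrative assumptions, not any vendor's actual part list:

    ```python
    # Rough, assumed sketch of the two designs discussed above; block names are
    # illustrative, not taken from any vendor's datasheet.
    typical_tv_soc = {
        "cpu": "modest general-purpose core",
        "fixed_function_blocks": [
            "broadcast demux",           # transport-stream demultiplexing
            "video decoder",             # hardware MPEG-2 / H.264 decode
            "audio decoder",
            "picture-quality pipeline",  # deinterlacing, scaling, noise reduction
        ],
    }

    sony_google_tv = {
        "tv_chip": typical_tv_soc,                 # still there for reception/processing
        "companion": "Intel Atom CPU + graphics",  # added mainly for the browser/apps
    }

    # The point above: the Atom side adds roughly the cost of a bare-bones PC on
    # top of a normal TV, which is hard to justify in a low-margin business.
    for name, design in (("typical TV", typical_tv_soc), ("Sony Google TV", sony_google_tv)):
        print(f"{name}: {design}")
    ```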

  • they should also give up on cell phones

  • One of my computer monitor's inputs is connected to the cable set-top box via HDMI.
    To watch TV I just push a switch on the monitor.

    • While not everyone needs a TV, TVs and monitors are optimised for different things. My local cable company's standard-def boxes don't have HDMI out, only component/composite. A lot of my gear is old enough that it doesn't do HDMI. My computer monitor doesn't have output ports like a TV does. My monitor has a much higher pixel density, but it would suck for watching a movie with a bunch of other people.

    • by tepples ( 727027 )
      Does your computer monitor or your cable set-top box also upscale composite video so that you can use legacy video sources, such as a VHS VCR (for cult classics that still haven't been rereleased on DVD due to licensing complexities) or a non-HD video game console?
  • Most companies never try to make something great (Apple being one exception; at least they try); most companies want to make it as cheaply and as crappily as they can get away with - i.e., "getting away with" = what people will still pay for.

    Whilst calling Intel's offering crap might be too harsh, it was never promising or exciting, just their own little closed ecosystem. If we could get TVs running Honeycomb where you could sideload APKs, you'd see such a TV system take off quickly, because all of the thing

  • No wonder. Want low-power? Look at ARM. CISC devices cannot consume as little power as RISC - they have to pay for the extra features.
