Intel Gives Up On TV 89
symbolset writes "Bloomberg is reporting that Intel, on the cusp of having low-power embedded chips that can do true HD in a flatscreen, has given up on getting its chips embedded in TVs. While many might say their efforts to date have been fruitless because of energy issues, Medfield might have had a chance in this field."
Translation, please? (Score:4, Interesting)
Intel has been unable to provide a chip that offered significantly different performance from rival offerings, and failed to convince TV makers such as Samsung Electronics Co. or Sony Corp. that they needed its chips, Acree said.
OK, geeky people, what does that mean?
I interpret it as "producing chips for TVs is a commodity business and there's little opportunity to introduce anything new." Was Intel just late to the TV chip party and other chipmakers had it sewn up?
I would think even as a commodity producer, Intel would be competitive just because they have huge scale.
Re: (Score:2, Informative)
I took it to mean Intel wants to cram x86 suckage into everything, to leverage their efforts at making a low-power x86 for UMPCs.
But nobody wants to run extant x86 apps on their TV, so everyone is happy with their ARMs (mostly), MIPS, etc. If Intel were willing to depend on someone else's IP, they could get back into the ARM business and just clean house (as you said, huge scale, and usually a process shrink or so ahead of their competitors), but x86 just isn't cutting it.
Re: (Score:2)
i never did understand why they got out of the ARM group.. their xscale cpu's where better than the competition by a long shot. Sure they where more expensive but it was worth it if you actually had to use the device. and if someone did manage to challenge them they have the scale/volume to drop prices and clean house.
Re: (Score:1)
Re: (Score:2)
thanks - didn't realize i missed that - I've never been good with spelling and correct grammatical use, i proof read everything but i still miss a lot.
Re: (Score:1)
Re: (Score:2)
i do wish you could edit comments to fix errors like that.
I do try not to perpetuate the norms that have formed here - but rather the norm of what was originally intended.
Not even games? (Score:2)
But nobody wants to run extant x86 apps on their TV
Not even games? Imagine a TV with a built-in PC that can connect to Steam, Impulse, and GOG to download games. Put your wireless mouse and keyboard on a TV tray, and PC gaming is back. Apple already makes a 21" and 27" model.
Re: (Score:2)
My wife won't let me, but I'd do it if I could.
Re: (Score:2)
Re: (Score:2)
Why not? TVs are getting "smart", and many of them are already connected to personal computers.
Might as well have the choice of a proper Windows PC TV, without a mess of cables and without having to use a keyboard/mouse to navigate.
Re: (Score:2)
I can already do all that stuff without having the TV and PC share a case. My computers have HDMI out and my TV has VGA in.
Almost nobody else has what you have (Score:2)
My computers have HDMI out and my TV has VGA in.
Some people still have a CRT SDTV with only a composite input, or an early 1080i CRT HDTV with only composite and component inputs. And even when the video signals are compatible (VGA out to VGA in, or HDMI or DVI out to HDMI in), not everybody has a computer in the same room as a TV. They might own only one computer and not want to carry it back and forth between the computer desk and the TV. It appears that in practice, statistically nobody is interested in buying a computer to hook up to the TV. (See pr
Re: (Score:1)
Good footnoting, there, tepples!
This is probably a function of time of purchase. I have not seen anyone buy a laptop that didn't have a TV-out of some sort (composite, DVI, HDMI or S-video) for years, but it seems like VGA inputs are just starting to become standard on HDTVs recently, and HDMI-out on video cards still isn't really widespread (although obviously it is already commonplace on gamer video cards). I just happen to have bought my first HDTV last solstice; early adopters are probably more limite
Re: (Score:2)
HDMI-out on video cards still isn't really widespread
DVI-D output has long been standard on even low-end video cards, even if not on desktop integrated graphics. I've seen several video cards with no VGA connector, just a DVI-A to VGA adapter hanging off a DVI-I port. My TV has an analog audio input next to one of its HDMI inputs, which appears to have been designed specifically for use with a DVI to HDMI cable and an analog audio cable.
But just because the port is there doesn't mean that TV owners A. know it's there or B. feel like using it. Is there anyt
Re: (Score:1)
That's a pretty good start on a howto, there (as somebody commented, you ought to find a Brit to add SCART).
My kids and octogenarian grandfather would have no problem following it. The pictures of connectors and corresponding tables work really well.
But my mother would never find it useful, because she doesn't want to know how to hook up electronics. She doesn't need to change her attitude, either - she's just fine the way things are! Similarly, I don't want to know how to deconstruct poetry or mine gyps
Re: (Score:2)
So I guess I'm saying "don't worry about it, you've already led the horses to the water. They'll drink when they're ready."
There are some video game genres that don't work well on the monitor connected to the average PC, but they work well on a larger, TV-size monitor. Take split- or otherwise shared-screen co-op games or party games in the vein of Bomberman or Smash Bros. It's kind of hard to fit two to four people holding gamepads around a 17" to 19" PC monitor. Such games have historically been released for consoles, but indie developers tend to be unable to afford the organizational overhead of console game development.
Re: (Score:1)
Samsung produces ARM chips, which have smaller instruction sets than x86 chips and can run more efficiently in less demanding applications such as TVs. While Intel has made great strides in producing smaller and more efficient x86 chips, they are still just too bloated and power hungry.
Car analogy: Sticking a V8 in a golf cart. While the idea sounds appealing, it is not efficient for the designed purpose of the golf cart.
Re: (Score:2)
I think the problem might be that they found out that TVs don't really eat chips, at least not at the same rate as the viewers do.
And your car analogy makes no sense at all. Why would my golf cart need a java script interpreter?
Re: (Score:3)
Re: (Score:3)
Re: (Score:1)
I read this and thought... Yeah I guess your point is that everything has to be on the net these days.
Then I thought, wait there is something more going on here. ....
Oh, links, golf, browse... and suddenly there is coffee all over my screen.
-Well played sir.
Re: (Score:1)
Re: (Score:2)
Last I read (in the last month or so) something like 70% of goods purchased in the US are made here. We are also still one of, if not the, largest exporter in the world. I don't remember the details though.
Of course 'made here' may only mean 'assembled here from parts made all over the world' - like a Boeing 787 - Japanese manufacturers have been making major parts of Boeing planes for at least two decades, and with the 787 major parts are made in (surprise, surprise) every one of Boeing's major market zo
Re: (Score:2)
Re: (Score:2)
I think you are correct, it was value.
Re: (Score:2)
V8 Lawnmower [youtube.com]
Turns out there are quite a few of these - Google is your friend. Not to mention Tim Allen's Turbo Mower in Home Improvement - The Great Lawn Mower Race [youtube.com] against Bob Vila.
Re: (Score:3, Interesting)
Keep in mind up until 2000-something Intel was not only one of the largest chip manufacturers, but also one of the largest manufacturers of embedded controllers in the world. Some MBA dickhead under Otellini or his predecessor (the dude who fucked them to rambus for the first year of the P4's life) decided that embedded wasn't a high enough 'profit' division to hold onto and either sold it off or spun it down. Point? Intel already dominated that market many years ago, but due to trends in management are too
Re:Translation, please? (Score:4, Informative)
Dickhead == Craig Barrett. Undid 25 years of Intel culture in less than a year. It took Otellini (for whom I have the greatest respect) almost two years to correct the Barrett fall out. But in the end, Intel pretty much makes decisions based on gross margin per wafer. They'll do strategic things for a while, but if the margin per wafer doesn't show up pretty soon, they kill the experiment. (Speaking of strategic, here's a fun game: The next time a salesman (or marketroid) tries to convince you to do some deal because "it's strategic", respond with "Oh, you mean it's no revenue." Enjoy deer-in-headlights face.)
Re: (Score:2)
They'll do strategic things for a while, but if the margin per wafer doesn't show up pretty soon, they kill the experiment. (Speaking of strategic, here's a fun game: The next time a salesman (or marketroid) tries to convince you to do some deal because "it's strategic", respond with "Oh, you mean it's no revenue." Enjoy deer-in-headlights face.)
Isn't that pretty much the definition of strategic? We're not great at making these kinds of products today; we lack the customer base, the experience, and the reputation. So we do projects at break-even, or possibly even at a slight loss, in order to break into the market, because our strategy is to become an established player. If a) we aren't able to establish ourselves, or b) we do and there are still no profits, then we don't keep doing what doesn't work.
Re: (Score:2)
Actually, no. That is not "pretty much the definition of strategic".
I don't like Microsoft, but they play an awesome strategic game. They invest in stuff, they buy stuff, they develop stuff - oftentimes, stuff that really has no future. But do they ever get rid of any of that stuff? Not only "No", but "HELL NO!" Microsoft may put things on a shelf, and halt development if it loses too much money - but they aren't about to get rid of anything. They can afford warehouses, terabyte on terabyte of hard dr
Re:Translation, please? (Score:5, Interesting)
Keep in mind up until 2000-something Intel was not only one of the largest chip manufacturers, but also one of the largest manufacturers of embedded controllers in the world. Some MBA dickhead under Otellini or his predecessor (the dude who fucked them to rambus for the first year of the P4's life) decided that embedded wasn't a high enough 'profit' division to hold onto and either sold it off or spun it down.
I worked for Intel during that period. Management was totally poisoned by the dot com disease. You could have a business plan that called for spending $50 million over five years to create a guaranteed $150 million a year product line with 25-40% margins and they didn't want to know. They were only interested in stuff that supposedly was going to produce a $500 million business a year in 18 months. They spent vast sums of money on second string chip companies, some of whom were already in trouble before the bottom fell out.
The other thing is their focus on margins, which is deranged in that it's a straight percentage target that isn't adjusted for the market. In some markets, yes, 60% is needed because you need to reinvest constantly in new designs. But there are other markets where 20% is more typical - markets where the product life cycle is 36 months, not 12. So they give up on stuff when the margins aren't there, forgetting that the proper metric is return on investment. Case in point: say you have a business that makes $150 million in revenue at a 20% margin - that's $30 million. If you have to invest $10 million a year in product design and whatnot to keep that business, then you make $20 million a year off a $10 million investment - a 200% return. But the way Intel sees it, it's only 20%, not enough to waste time on.
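The arithmetic in that comment can be sketched in a few lines. This is just an illustration of the commenter's point using his hypothetical numbers ($150M revenue, 20% margin, $10M/year reinvestment); the function name and figures are made up for the example, not anything from Intel:

```python
def roi_vs_margin(revenue, margin_pct, annual_investment):
    """Contrast gross margin (a % of revenue) with return on investment
    (net profit as a % of the money actually reinvested each year)."""
    gross = revenue * margin_pct / 100        # gross profit implied by the margin
    net = gross - annual_investment           # profit left after reinvestment
    roi_pct = net / annual_investment * 100   # return on the capital at risk
    return net, roi_pct

# The hypothetical business from the comment above:
net, roi = roi_vs_margin(150e6, 20, 10e6)
print(f"net: ${net/1e6:.0f}M/yr, ROI: {roi:.0f}%")   # net: $20M/yr, ROI: 200%
```

A 20% margin business can still be a 200% return on the capital reinvested each year, which is the commenter's complaint: a flat margin target hides that.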
Then the dot bomb happened and they tossed overboard everything that wasn't going to turn a profit in 12 months. They also stopped development on product lines, thus killing them over the medium term. Of the dozen or so companies they bought 1998-2001, they closed all but one, and that one only because being 12 and 0 would have been too embarrassing.
My impression is that Intel has a lot of capable people, and money to hire more. But upper management has issues. When they enter a business and find the other players determined and competitive, with customers who are used to wheeling and dealing to get the best value out of their vendors, management gets pissy and shuts everything down instead of sticking it out long enough to carve out some market share.
Re: (Score:2)
That was well said! However, I don't see this as an Intel issue. Most of the companies I have worked for over the last 10 years seem to have shifted to the same mindset: only short-term returns, and cutting heads to show a profit, all so management can make their bonuses. At some point it has to fail; I just hope that when it does, the board sees it for what it is and makes a better choice when replacing them.
Re: (Score:2)
Re: (Score:2)
True, but then again, considering Intel is one of the top video card makers out there, does it really m
Re: (Score:1)
Intel has been on the cusp of producing the ultimate low-power chip that works everywhere from cell phones to microwaves to toaster ovens to tablets to watches [just about a year from now] for the past 10 years.
They ain't there. They ain't about to get there. And somebody's already there and has been doing it for a long time.
Bullshit finally walks.
Re: (Score:2)
I was there. Intel doesn't "get" the TV biz. They're not gonna. This announcement is their admission that they don't get it.
This is sad because they were almost there. Their prior efforts were sad, but they had some prime shizzle in the pipe that had a legitimate chance.
Re: (Score:2)
I can agree to this (and was there too - as a former member of the Digital Home Group) - The Canmore [engadget.com] project for instance had some serious potential, but the thing was somewhat hobbled from the start (NDA prevents opinions as to exactly why, but let's just say that IMPO it could've done a lot more than it actually did).
The biggest problem was that they interrupted everyone on the Oregon side with a physical move (From CO to JF), and after that began the whole 'let's be a part of Viiv!' bullshit (Viiv? Yeah,
Re: (Score:2)
Not at all. There might be some market for a more capable chip at the same power consumption if the price is right, but Intel couldn't fulfill that demand. Their chips were either too expensive, not capable enough, or too power hungry for that application.
Other potential disqualifying factors might include not being willing to guarantee a long enough product lifetime. Unless there have been improvements, Intel's relatively weak debug and test interfaces could also play a role.
Last up, ARM and MIPS based chi
Re: (Score:1)
Was Intel just late to the TV chip party and other chipmakers had it sewn up?
I blame Doritos.
Re: (Score:2)
OK, geeky people, what does that mean?
It probably means these companies already have SoCs they use for this stuff (and may have a stakeholding in) and see no reason for ditching what they have for something produced by Intel.
Re: (Score:2)
I wonder if this CPU from a huge consortium of very powerful Japanese companies is finally making some progress?
Seven Japanese Companies to Develop Microprocessor to Compete Against AMD and Intel [xbitlabs.com]
Re: (Score:2)
Intel chips. In TVs! (Score:4, Interesting)
I remember a TV many years ago, perhaps late 90's or early 2k, which booted with a common Award BIOS screen and RAM check. I think we sold exactly one (and that one was the display model).
It was a useless device. Despite having a high-res CRT display with decent color, and a line doubler (which was potentially way cool in those pre-HDTV/DVI/HDMI times), it sucked: It irrevocably upscaled the output of a PSX, and the result was double-ugly instead of double-smooth since it got the field order precisely wrong.
It had an Intel CPU.
Is it dead now?
Good.
Thanks!
[/shallow]
Re: (Score:3)
Umm, that's not science. One data point means nothing.
Were we doing science here? I thought this was a message board.
Re: (Score:2)
I think the anecdote has something to add about the concept of "smart" televisions in general.
Re: (Score:3)
Dumb TVs don't have CISC CPUs trying to solve the world's problems.
"Smart" TVs do.
If you can't detect the difference, then there's nothing more for us to discuss on this matter.
Re: (Score:1)
Chips merely crunch numbers. The software running on (or embedded within) the chips is where the smarts are. [And, I
Re: (Score:2)
You actually want me to refute the rest of your post?
You really want me to refute every portion of your post? Seriously? What sort of weird masochistic pedant are you?
But as you wish:
Yes. So does my toaster, my thermostat, the remote for my car locks, and my flashlight. They do not have general-purpose CPUs, though, which is what Int
Re: (Score:1)
Funny how you ignored 90% of the content of that post and latched onto the one thing that you thought you could potentially contradict me on.
You really want me to refute every portion of your post? Seriously? What sort of weird masochistic pedant are you?
The kind that hangs out at Slashdot, obviously (like 90% of the other weirdos here [including you]).
But, even dumb TVs have chips!
Yes. So does my toaster, my thermostat, the remote for my car locks, and my flashlight. They do not have general-purpose CPUs, though, which is what Intel is in the business of selling.
You know that Linux runs on all of those devices too, no? And, I think Intel sells a fair bit more than GPCPUs. You're the one who interpreted 'embedded' as 'general-purpose CPU' and 'smart'; not myself. See: http://www.intel.com/p/en_US/embedded/hwsw/hardware [intel.com]
This story has never been about 'general purpose CPUs' and 'smart' TVs. That is precisely the error I pointed out initially.
TVs are merely display devices.
TVs are radio receivers that include a display device. Display devices which do not include a radio receiver are called "monitors." (The "tele" in "television" is not without specific, direct, and obvious meaning.)
Of course. Do you rea
Re: (Score:2)
While I suspect that you'll accuse me of cherry-picking again because I dismissed the rest of your post, I think we can both agree on that. Although I'm still not entirely sure why you steered it in this direction...
HTPCs are for geeks (Score:2)
Ultimately people want to use the TV as a display for their 'smart' devices.
Then why do only geeks [pineight.com] ever buy or build a PC to hook up to the TV, as CronoCloud has pointed out [slashdot.org]?
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
If I were serious about entering this market though, I'd avoid p
What does this mean for Google TV? (Score:2)
The Google TVs from Sony use Intel chips, according to their own marketing at least. Will Sony give up on Google TV or switch to ARM?
LCOS (Score:2)
come to think of it, Apple and Google also tried TV and failed
maybe computer companies should just stick to making computers and leave TV to Sony and Samsung
Good riddance (Score:2)
Good riddance, I'd say. I'm sick and tired of the 800 lb gorilla sticking its nose in everything that has more than a dozen transistors.
Intel Gives Up On TV (Score:5, Funny)
Intel Gives Up On TV
I don't blame them ... there are very few good programs and too many reality TV shows
As a former TV Product Planner (although (Score:3, Interesting)
not for Google TV).
Intel chips are expensive, and these days you would very much be expecting a highly integrated chip with demuxes and decoders for digital broadcasts, plus video and audio processing elements to improve quality. There would typically be a whole bunch of functional units for most functions, all baked onto the silicon. The general-purpose processor would typically be fairly weak but with a lot of support. Main processors may get somewhat more powerful to support browser-type technology, but I wouldn't expect them to reach Intel Atom speeds in most cases for some time. Which would you rather have, a TV with a fast web browser or good picture processing?
The current Sony Google TVs (the integrated screens) still carry the same main chip as the rest of the Sony range, in addition to the Intel processor and graphics. I'm not certain to what extent this is absolutely technically required, or whether it was needed to reuse the existing TV reception and processing software. This means that the cost to build a Google TV was like building a normal TV and adding a bare-bones Atom PC. Expectation of purely additional sales, marketing funds from Intel, and an expectation of smaller margins for retailers were what made the business case, I understand, although I think there were also some unreasonable assumptions, particularly if you had ever tried the product.
http://techon.nikkeibp.co.jp/english/NEWS_EN/20101117/187451/ [nikkeibp.co.jp]
http://www2.renesas.com/digital_av/en/mpegdec_tv/emma3tl2.html [renesas.com]
If Intel do back away from the highly cost-sensitive TV chip business, I would expect Google to offer support for ARM. I think most of the TV manufacturers are on, or moving to, ARM, although MIPS is certainly used in current models. The newer high-performance ARM chips are probably significantly more expensive than the typical TV processors, but they probably make more sense than the Intel Atoms, given the ability to custom-specify the chip features and still be cheaper.
Features on such chips will be specified by major manufacturers, but the feature set will probably be locked down at least 18-24 months before the TV ships, ruling out some things after that date.
The TV business is a hugely competitive market and there is no profit in it (possibly with the exception of companies that have their own panel manufacturing). The combination of falling prices, long parts lead times, and the importance of volume in getting good component prices makes it a very tricky business to make money in. But it is key to many companies' positions in the consumer electronics area and can bring leverage into other businesses (by enabling retail space, offering full product suites, and increasingly giving scale to over-the-top online video offerings).
if they are having power issues with TVs... (Score:1)
they should also give up on cell phones
Who needs a TV anyway (Score:2)
One of my computer monitor's inputs is connected to the cable set-top box via HDMI
To watch TV I just push a switch on the monitor
silly question (Score:2)
While not everyone needs a TV, they're optimised for different things. My local cable company's standard def boxes don't have HDMI out, only component/composite. A lot of my gear is old enough that it doesn't do HDMI. My computer monitor doesn't have output ports like a TV does. My monitor has a much higher pixel density, but would suck for watching a movie with a bunch of other people.
Re: (Score:2)
They didn't want to do a concept that was good eno (Score:2)
Most companies never try to make something great (Apple being one exception; at least they try). Most companies want to make it as cheaply and as crappily as they can get away with - i.e., "getting away with" = what people will still pay for.
Whilst calling Intel's offering crap might be too harsh, it was never promising or exciting, just their own little closed ecosystem. If we could get TVs running Honeycomb where you could sideload APKs, you'd see such a TV system take off quickly, because all of the thing
CISC! (Score:1)