
Sony Announces GScube Development System

A reader writes: "At the SIGGRAPH 2000 computer graphics conference this week, Sony Computer Entertainment announced its 'strategic vision for the evolution of computer entertainment in the broadband network era.' At the core of this vision is a piece of hardware which is composed of the same building blocks as the PlayStation2, but multiplied many times. The GScube development system, as it is called, will be used in the development of 'e-cinema,' computer graphics movies and other new digital content. The GScube prototypes are powered by 16 Emotion Engines and 16 Graphics Synthesizers. Yes, this is 16 PlayStation2s rolled into one. It is NOT a game system, though, so don't write into Q&A asking about what games will be released for it. It's a computer graphics workstation, comparable to the systems created by Silicon Graphics and other companies to produce high-quality CG movies. It'll be a bit too expensive for you to pick up at your local Toys R Us." So, the going theories about the Emotion Engine as a general-purpose CPU seem to pan out.

  • They are cramming all that CPU power/bus bandwidth/fill rate into their system but are still limiting the color depth to a pitiful 32 bits per pixel (8 bits per channel)??

    Really, besides the enormous amount of CPU and bus bandwidth, there really isn't anything here that can't be found on the average 1999-2000 consumer PC hardware.
  • The difference between this machine and a server is the Graphics Synthesizer. Remember, this machine doesn't have 16 CPUs. It has 32: 16 sets of Emotion Engines and Graphics Synthesizers (GS). The GS will be dedicated to graphics work, and compared to the GS in the PS2, which has 4MB of on-die cache, and the new SGIs with 8MB, this machine will have a whopping 32MB of on-die cache. And much like the new SGIs, this machine will be modular; hence the cube design. Also, I don't think Sony will be aiming these machines at IT managers. According to a GameSpot article ( http://www.zdnet.com/gamespot/stories/news/0,10870,2606952,00.html ) the GScube is supposed to work in conjunction with SGI servers.
  • Remember IBM and their "Computers are for business" attitude whenever asked about better sound and, god forbid, colour graphics?

    Same thing with early Compuserve (are they still around?).

    Or remember [insert your college's policy here] on playing games on the UNIX servers? (i.e., Netrek and other early net games that just didn't work on home computers)

    Face it, the best systems for games are the really expensive cutting-edge ones. Stop looking at game playing/development as some sort of crime against science. Relax!

  • RISC IV
  • No, it's something designed to model and render CG movies with. Who in their right mind would use it for a webserver?
  • It's a MIPS core, with a couple of nice vector units. Ars Technica has a couple of overview articles:

    http://arstechnica.com/reviews/1q00/playstation2/ee-1.html
  • It's rack-mounted, so the width is set by the mounting.
    It's not very deep, so even if installed in a 600mm-deep rack (480mm max install depth), it has some space for cables and airflow.
    It's 10U high. Probably it just didn't fit into an 8U case, being only 16.6" deep.

    Not everything needs to be a cube because it was original once; it may be a cube because the parts happen to fit best into that kind of case.
  • Yes, the export restrictions on the PS2 got plenty of media attention, and yes, there were plenty of authoritative sources cited telling you exactly how many PS2s you could take home.

    But about a month ago, a friend of mine traveled to Japan, and brought back eight PS2s. Yes, eight (8) of them.

    With no problem. In fact, he asked customs about it when he arrived, and again, just before he bought them, and they laughed and told him that they had seen the news, but that there had never been any official hold on PS2s. In fact, one of them actually said something along the lines of "it's just for publicity".

    Now, this is one person, and I have no idea where he came in/went out of Japan (other than by air between the US and Japan), but I would think that such a loose attitude would not exist in customs officials unless it really was just a PR stunt.

    --
    Evan (YMMV, IANAJ)

  • That's exactly where the submission linked.

    Not the first time a submission is the first two paragraphs of the article it links to.
  • Nintendo Star Cube, Apple G4 Cube, Sony GScube...cubes are the hot new trendy thing these days!

    You can make anything seem cooler by adding "cube" to the name. For example, "Sack of Shit" = bad, but "ShitCube" = good.

  • but are still limiting the color depth to a pitiful 32 bits per pixel (8 bits per channel)??

    Why not, since the CRT (or LCD) you're going to display that on isn't capable of any better. (TV screens are even worse, since the transmission standards impose a contrast range even lower than what the hardware is capable of.)

    Now, when something like micromirror projectors become more widely available, increasing that might make more sense. (The Cineon image format for motion picture work uses, I think, 10 bpp, on a logarithmic scale to correspond to photographic film's range.)
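
    Just to get a feel for why a log encoding buys so much range, here's a throwaway C sketch - my own made-up 10-bit log mapping over a 1000:1 range, NOT the actual Cineon spec:

      #include <math.h>
      #include <stdio.h>

      /* Map linear light over a 1000:1 range onto a 10-bit log code.
         Equal code steps then correspond to equal *ratios* of light,
         which is roughly how film (and the eye) responds. */
      int main(void) {
          for (int e = -3; e <= 0; e++) {
              double lin = pow(10.0, e);  /* 0.001 .. 1.0 */
              int code = (int)(1023.0 * (log10(lin) + 3.0) / 3.0 + 0.5);
              printf("linear %.3f -> log code %4d\n", lin, code);
          }
          return 0;
      }

    A linear 10-bit code would burn half its values on the brightest stop alone; the log code spreads them evenly across the whole range.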
  • You need to store intermediate results without losing precision for multipass rendering effects. Most multipass rendering effects can be accomplished in one pass on the PSX2, but there are still places (transparency being the most obvious) where you are modifying an existing value in the back buffer.

    Personally, I would like a floating-point color buffer, with hardware-accelerated stencil and accumulation buffers.
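
    To see why, here's a toy C illustration (made-up numbers, no real API): blend four 25%-opacity layers into a back buffer, once quantizing to 8 bits after every pass, and once accumulating in float and quantizing a single time at the end.

      #include <stdio.h>

      int main(void) {
          unsigned char layers[4] = { 90, 90, 90, 90 };
          unsigned char dst8 = 0;  /* 8-bit back buffer */
          float dstf = 0.0f;       /* float accumulation buffer */

          for (int i = 0; i < 4; i++) {
              /* the cast truncates, losing a fraction every pass */
              dst8 = (unsigned char)(dst8 + layers[i] * 0.25f);
              dstf = dstf + layers[i] * 0.25f;
          }
          printf("per-pass quantize: %d, quantize once: %d\n",
                 dst8, (int)(dstf + 0.5f));  /* 88 vs 90 */
          return 0;
      }

    Two counts of error after only four passes; stack up enough passes and that's visible banding.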
  • It uses RDRAM because the PlayStation2 that it's based on uses RDRAM. RDRAM actually doesn't have the huge overhead we see in PCs when it's only 16 MB or so (I don't remember the actual amount offhand, but it's less than 64) on a console machine. If it is in fact simply 16 PS2 machines slapped together in a new cube-shaped box, then the actual RDRAM size should be small enough to avoid the problems desktop machines have in regards to RDRAM latency.
  • I guess the regular PlayStation2 wasn't challenging enough to program for... Seriously, do you run RenderMan on this thing? What would you do with the Graphics Synthesizers then? Will it support OpenGL 1.2? Would you be using the other 15 EEs as coprocessors then? I at least give Sony credit for sticking with their ambitious plans for the EE, but come on, this seems really presumptuous -- glomming 16 PS2s together and calling it macaroni.

    What could possess them to put this system together? Sixteen-way multiplayer? Did Square order a bunch of these? Hmm. Wait, I've got it... they're demo machines for showing pre-release PS2 games on at E3 and ECTS! Now it all makes sense. :^)

  • okay, look, i know Apple owns all the NeXT stuff, but seriously: the PlayStation2 looks like a NeXTstation, and now that they're coming out with a cube... is it gonna be big and black, 'bout a couple feet square?

    there's even some similarities between the colors of the PlayStation logo and the colors of the NeXT logo...

    anybody else see this?

  • The biggest value of any form of real-time content generation is interactivity. There are many possibilities, e.g. some sort of interactive movie with viewer participation.
  • What the heck is a Nintendo Star Cube?

    The Star Cube is going to be Nintendo's next game console -- it was recently renamed from "Dolphin".

  • is that supposed to be "G-S-Cube" or "G-Scoob",
    as in "G-Scube, I wonder where Fred and Daphne went?"
  • For some reason, I don't think the GSCube was designed with audio in mind. It may have something to do with the fact that it's a graphics workstation, but that's just my guess.
  • The chipset in the PS2 is just a fraction of the cost of the whole unit. It also has a DVD drive, 32 MB of RAM, a USB port/controller and a hard drive interface. Most of these other components would not be duplicated 16 times, but 512 MB RAM standard would be nice.
  • There is also the practice of virtual sets, in which an actor/newscaster is filmed against a blue screen with a motion-recording camera. A realistic 3D environment is rendered and composited around the actor in real time, using the correct camera positioning.
    Also, as mentioned elsewhere, there are the virtual actors (Ananova, 'Duke', etc.), and real-time ad compositing and placement during sporting events and newscasts.
  • Taken from SGI's site:

    Onyx2 InfiniteReality3 Multi-Rack system
    256MB to 256GB RAM
    up to 128 R12K processors and 16 visualization units
    up to 200M poly/sec
    up to 7GPixel/sec fill rate
    320MB frame buffer, 1GB texture RAM
    8-tap AA

    Onyx2 InfiniteReality3 Single-Rack system
    256MB to 16GB RAM
    up to 8 R12K processors and two visualization units
    13.1M poly/sec per visualization unit
    up to 480MB frame buffer
    8-tap AA

    Onyx2 InfiniteReality3 Deskside system
    256MB to 8GB RAM
    up to 4 R12K processors
    80 or 160MB frame buffer
    64MB texture RAM
    13.1M poly/sec
    8-tap AA

    In other words, in a machine that takes up about four cubic feet of space and requires a single outlet, Sony can outperform a vastly more expensive (and much larger) top-of-the-line SGI. However, SGI still has the GScube whipped for RAM. In Sony's favor, though, the GScube is designed to be a workstation, something which the Onyx2 IR3 cannot claim.
  • The Nintendo Star Cube is the supposed name of Nintendo's new game console, currently codenamed "Dolphin". Nintendo will allegedly clear this up at a trade show in late August. Many people have trashed the "Star Cube" name, but I seriously think it sounds kinda cool.
  • interactivity, real-time raytracing, high-bandwidth connection...

    More buzzwords in case they're running out of ideas for the next press release.

    How about an interactive streaming internet TV show, where we get to vote someone off of the island? (Or out of the dev team.)

  • by EnglishTim ( 9662 ) on Tuesday July 25, 2000 @12:25PM (#907449)
    Okay, here's a press release we put out today - basically it means that RenderWare and RenderVision will be available on the GScube. RenderWare is a cross-platform middleware graphics library, and RenderVision is this cool system that allows you to preview (and walk through) the scene you are designing in your 3D package, instantly, on your target platform (like PS2, for instance...)

    (Actually, I work for the Fiendish Games [fiendishgames.com] part of Criterion...)

    Criterion Software [csl.com] delivers RenderWare [renderware.com] on Sony Computer Entertainment's GScube Development System

    RenderWare ushers in the future of real-time digital creation in the broadband era

    NEW ORLEANS, July 25th, 2000 - Criterion Software [csl.com] today announced a major milestone in its mission to provide a seamless digital content development path from concept to delivery, with the demonstration of its market-leading RenderWare3 [renderware.com] interactive 3D graphics middleware driving Sony Computer Entertainment Inc.'s GScube visualizer prototype for e-cinema and real-time digital content creation.

    A subsidiary of Canon, headquartered in Guildford, UK, Criterion Software Ltd. [csl.com] is the leading developer of multimedia middleware for convergence platforms, including next generation videogame consoles, digital televisions and web terminals.

    "We reshaped the world of 3D middleware with the invention of our Renderware3 [renderware.com] Powerpipe architecture." said David Lau-Kee, President, Criterion Software [csl.com], "Powerpipe is a uniquely flexible graphics software architecture that allows unprecedented control over content-specific processing, special effects and acceleration. Now, driving the GScube development system, the benefits of this flexibility towards the creation of insanely inventive digital content by studios across all forms of visual entertainment become startling clear."

    Subject to agreement, Criterion Software [csl.com] expects to provide its 3D middleware and tools, including RenderWare [renderware.com] for 3D programmers and RenderVision(TM) for 3D artists, to GScube application and content developers in the near future.

    "Criterion Software is a valued partner for our PlayStation2 middleware program," said Makoto Hatakenaka, Vice President, Sony Computer Entertainment Inc. "and we are pleased that they are providing the same high level of commitment and support to GScube, our next step in pioneering the creation of real-time digital entertainment.

    Sony Computer Entertainment Inc.
    Recognized as the undisputed global leader and the company responsible for the progression of consumer-based computer entertainment, Sony Computer Entertainment Inc. (SCEI) manufactures, distributes and markets the PlayStation game console and the PlayStation2 computer entertainment system. SCEI, along with its subsidiary divisions Sony Computer Entertainment America Inc. and Sony Computer Entertainment Europe Ltd., develops, publishes, markets and distributes software, and manages the third-party licensing programs for these two platforms in the respective markets worldwide. Headquartered in Tokyo, Japan, Sony Computer Entertainment Inc. is an independent business unit of Sony Corporation.

    Criterion Software [csl.com]
    Criterion Software develops and distributes the market leading 3D middleware tools for convergence platforms, including RenderWare3 for 3D programmers and RenderVision for 3D artists, and distributes complementary 3rd party tools, such as Metrowerks CodeWarrior® game development tools for PlayStation2 computer entertainment system in Europe and North America.

    RenderWare is a registered trademark of Canon Inc. RenderVision is a trademark of Criterion Software Ltd. PlayStation is a registered trademark of Sony Computer Entertainment Inc. All other company and product names may be registered trademarks or trademarks of their respective companies/holders, and are hereby recognized.
  • Looks like the claimed specs for the 3DFX Voodoo5 6000!

    *joke*

  • Mainly because the thing's designed to do floating point. It's kind of like how those 64-CPU Hitachi vector units end up in the top few slots alongside 512-CPU SGI units.
  • well... The PlayStation2 dev kit uses Linux as its OS, so I'm assuming that this machine will use Linux also.
  • The current "top of the line" SGI spec wise?

    Also for all those saying this won't be a good game box, read the goddamn article.

  • Yep, a 1GHz Athlon can barely manage 1 GFLOPS. Same goes for the 500MHz G4, and a P3 can't even do 1 GFLOPS (at this time). The CPUs in IBM's ASCI White supercomputer also hover somewhere above 1 GFLOPS (of course, they have thousands of them in the machine).
  • heh. Look at my user info page... I actually have decent comments.
  • Sony is a lot better equipped than most companies to walk into an office and say "Hmmm, we'd like you to port, say, Maya over to our new Playstation-on-steroids. How much cash would you like in return?"
  • by Anonymous Coward
    Hi. Although I am a *massive* supporter of BeOS, I do doubt that the OS for this thing will be BeOS, for a few reasons... For a new, unsupported hardware platform like this, the last thing you'd want to run is a newish operating system like BeOS. There are a few things that are still very immature in BeOS at the moment. Plus, with Be Inc.'s focus shift to the IA market with BeIA (bah!!!), BeOS's existence is still in question... However, with all those processors, I would *really love* to see BeOS running an RC5-64 client on that :-) Thanks, Andy
  • I think that when people refer to it as CGI in that context, they mean 'Computer Generated Images'. So really, it isn't redundant, nor incorrect. Just a different way to put it.
  • *@#&$

    I *thought* it was in rather poor taste not to indicate the source. My mistake.

    Thanks. Back to snorkeling mode.

  • "a) what OS this will run and "

    The OS will be Final Fantasy. The cursor is just a character that runs around a map to different towns (directories) and enters shops (programs).

    That's all I can think of for now.

    joel
    BeOS's existence is still in question

    There's absolutely no question in my mind! They've made it clear that they will continue to support the OS, and they have. The BONE (BeOS Networking Environment) and OpenGL rewrites should be proof enough. Plus, I don't know if you're on the BeUserTalk list, but there is a massive amount of support coming from people outside of Be, Inc. Even from within Be, I've always gotten responses right away, and not auto-responses either :) Real people at the keyboards there.
    -G

    Linux is only Free if your time is worth Nothing

  • Seems to me that the "seem to" in the article quote is important here.

    There is a tendency to be 'net centric in this day and age, which, in this case, may blind us to the observation that there are many LBE (location based entertainment) applications (for example) for a device like this. Regular visitors to SIGGRAPH may remember the popularity of "The Cave" and other immersive environments including Egghead Shred, an interactive game involving 3500+ players in real time.

    What this may really be about is the continued trend of commoditization of hardware. If this were just 16 PS2s, then I could buy them for about $6500. Assuming a 2.5x multiple (because this is not a consumer device), we're still talking about $20k. Compare this against SGI Onyxs and similar workstations. This has been a trend since 1984, when I was with the company that introduced the first 24-bit paint system on an MS-DOS-based PC at SIGGRAPH. It was a fraction of the cost of competing proprietary systems.

    While software is very important, it may be that extensions of game development systems, and not more traditional 3D modeling/rendering/animation systems, are what's needed. I don't think that weather rendering for broadcast graphics is the best application (though at the price it might be cheaper than existing alternatives, and my guess is we'll see something like this at NAB next year), but the SciVis (Scientific Visualization) camp will really love this tool -- and that's where I'd expect to see the GScube shine at next year's SIGGRAPH, in addition to LBE uses.

    FWIW, the 32-bit (RGBA) limit is not all that distressing (to me). The real need for more than 8 bits per channel comes in when doing many layers of compositing -- to compensate for rounding errors in calculations. Without enough bit depth you get artifacts, eventually. If everything is being rendered in "real time" w/sub-pixel accuracy, then the bit-depth issues associated w/compositing go away. Also remember that this is motion, not still imagery, and that the eye is very forgiving in trade-offs between temporal bandwidth and physical/color resolution (bandwidth).

  • Unless you are talking about numbers for a specific benchmark (Linpack, for instance), peak for a 1GHz Athlon would be 4 GFLOPS using 3DNow!. That's single precision, but then so is what the Emotion Engine uses.
  • If they should go linux and provide OpenGL, then the road to Maya might be shorter than you think, considering that Maya is being ported to linux [slashdot.org].

  • One wonders if they'll iron out all the antialiasing/memory bus difficulties they're still having with the PS2...
  • Furthermore, unless I missed it, there was no mention of what operating system it's going to run

    Windows 95 OSR2. Complete with real-time rendered 3D cards in Solitaire, of course. This ought to increase the user base for Windows by at least 13! I know I sure couldn't live without 3D-rendered cards.

    :P

  • >a pitiful 32 bits per pixel

    Never mind that the human eye can't resolve color even that well (about 16 bpp, glad you asked). Sure, there are some fun things you can do by treating non-color attributes - e.g. transparency - as though they were colors, increasing the on-paper bpp, but when it comes to the output, this little numbers game is of no benefit to consumers.



    >besides the enormous amount of CPU and bus bandwidth, there really isn't anything here that can't be found on the average 1999-2000 consumer PC hardware.

    Yeah, besides that. Besides the most important determinants of overall system performance, which involve the most difficult design challenges, it's just like a PC. That's like saying that besides the armor plating and the big gun on the front, an Abrams tank is just like the family car... in other words, it's a totally meaningless statement.

  • 48K is just tolerable, but it can only output 16-bit digital audio? That's poor - for a bit of perspective, my UberGeekyProjectStudio (see above URL for examples of its work) will do 8 output channels, 48K, at _20_ bits. (It is a second-generation ADAT studio.) If you've worked with high-resolution analog, 20 bits is just becoming seriously acceptable - 16 bits is a poor joke; there just isn't the subtlety for professional mastering work.

    It would at least be helpful if someone could get info on the thing's internal bus bit depth: if _that_ is 16-bit too, we're talking 'The Horror! The Horror!' time. Even a sound tracker I've occasionally worked with (PlayerPro) has gone to running internal DSP busses at 32 bits and dithering to 16 for output - outputting 20 or 24 bits would be better still for professional work. The bar's being raised pretty damn rapidly when an ill-funded audiogeek like me can work in 20-bit analog mixing, all ready to feed DVD-audio at full resolution. If this GScube is not capable of doing better than 16-bit output, that's a major problem. Can't it at least do 8 outputs at 24 bits by storing the least significant bits on dedicated tracks? Hell, my _ADAT_ can do that (or could if I had two of 'em :) )

  • Oh man, this is the kind of box that will change the industry forever...

    Sure it's difficult to program, but with that much CPU power, there's no way this baby isn't going to get supported.

    This will be used initially to generate virtual sets, and probably in the sports-titling field.

    It will also find immediate application in the film industry - George Lucas will be able to interact with a guy playing Jar Jar Binks and see him as Jar Jar Binks in real time... I doubt it will make Star Wars 5 a better movie after the abysmal 4, but hey.

    Those IMAX movie presentations will immediately be converted to use this as rendering horsepower, and I can think of a whole lot of other applications.

    I honestly can't believe people are looking at this machine and saying 'it sucks cos it doesn't have enough colour depth' or 'it sucks because it only does 16-bit audio'...

    Not enough colour depth? Then write your own 128-bit graphics pipeline, implement it in software, and resample to 32 bits only when you come to the final rasterisation stage. You can't tell me that 32-bit colour isn't enough for the final render.

    No professional musician or audio engineer is going to use this box as a digital recorder, though it might very well make an awesomely powerful DSP engine and synth... but you'd dump that data over a serial interface to a D/A converter. If you're spending the money to develop the software to do it, then that extra little step is a drop in the ocean.

    This machine is about making mindblowing 3D graphics applications available at a price point that was unthinkable last year.

    And you have to respect that.

  • And you can do sysadminning with combo attacks and magic spells.
  • And I've just seen a demo of the beast. The guy shows what looks like an extract from the upcoming CGI film Final Fantasy from Square, but then he stops the scene at any point, rotates it in real time, changes the lighting, with no visible degradation... It's simply *amazing*.

    From what the guy said, there were 500,000 polygons on the screen. He also said it was a prototype and that Sony doesn't sell them yet.
  • Damn, I'm slow.

    Anybody know how this thing compares to the SGI machine just named? And are they using these things for the Final Fantasy movie that's coming up?

  • There's a much better press release at http://www.playstation.com/news/press_example.asp?ReleaseID=9573

    However, I would not like to see Linux on this one, and definitely not GCC either. Both suck in their MIPS implementations. And to the ones who talked about porting Maya etc. from x86 to MIPS: why??? Maya and SoftImage already exist for SGI's MIPS machines, and those would be MUCH simpler to port =)

    Besides, one company that has already proclaimed support for this piece of hunky cool hardware is Alias|Wavefront, makers of Maya.
  • How much would one of these cost? Hopefully they'll provide PS2 dev tools with one.
  • Just how expensive will it be? Seeing that the PS2 will be $300-$400, 16 times that is like $5000. Sheesh, I think I'll pass on this one.
  • Well, if we could generate realistic ray-traced faces on the fly, and couple them to motion-capture devices and good voice actors, we could stop having to hire news anchors based on their looks.
  • One of these links has already been posted above, but if anyone wants a well-written, easy-to-understand-but-plenty-technical introduction to why the Emotion Engine is cheap to produce and a terrible general-purpose chip, but an amazing design for high-quality low-latency 3D graphics, assuming programmers can figure out how to take advantage of it...

    This is wrong. The consumer level version of the chip is poor (not terrible) for general purpose use, but this is because it needs to be in a $299 device. So, for example, there's only 8K of data cache and 16K of instruction cache, and the clock speed is a "slow" 300MHz. There's no reason that both of these have to be true in a high-end device. In fact, Sony has said that they plan to push for higher-speed chips.

    Hmmm...re-reading the above I find it humorous that a 300MHz chip isn't treated with reverence. Heck, that was top of the line in 1998. And when you look at the amazing PlayStation games that have been done with a 30MHz processor...
  • I doubt that this will be used for rendering or animating or anything in the traditional sense.

    I see the system being used more as an interface for displaying precreated models interacting in an environment, similar to 3D video games.

    What you should be able to do with this system is load in some textured models, lights and the like, like there are in any game, and have it "hardware render" the frames as you watch. Like any 3D game. The main difference is that now you would have 16x the power of the PS2, so you could handle more objects, more polys, or maybe larger textures, but that is doubtful.

    The output could be a digital signal that is then read by a system that can display it in real time, if need be, or store it for later viewing.

    I am unsure why they are going this route but it seems like a no brainer if you have everything you want mostly pre-defined. You can use a package like A|W's Maya and create everything you need, throw it in the PS2 and play with the objects.

    I don't see this being used in a traditional 3D animation sense. The textures and models need to be of a higher quality than what I see the PS2 being able to provide. The motion needs to be keyframed so the animator can go back and tweak movement to fit the mood or the soundtrack.

    You could use this to produce "good enough for web" videos or live content but I see that as being the limit.

  • Oh look, it's another cube. How original :)

    That said a nice purple anodised aluminium one would look quite cool on my desk.

    Seriously though it should be quite cool to see what sort of real time content it can produce. I'd like to see more of the entirely CGI environment that real ppl can be placed into. BBC news certainly used to have something similar to that.
  • Developers are struggling with the complexity of programming the Emotion Engine, so how much support and backing will this receive?
  • Can you imagine?
    "They think its sexist"
  • A 295MHz chip that manages 6 GFLOPS. That's quite good in terms of FLOPS/Hz, isn't it?
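
    Back of the envelope, using just the numbers above:

      #include <stdio.h>

      int main(void) {
          /* numbers from the post above: ~6 GFLOPS peak at 295 MHz */
          printf("%.1f FLOPS per clock\n", 6.0e9 / 295.0e6);  /* ~20.3 */
          return 0;
      }

    Twenty-odd floating-point ops per cycle is what all those parallel vector units buy you, compared to the one or two a conventional scalar FPU manages.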
  • Does anyone know a good resource where I could get the specs on the CPU? Is it CISC or RISC? I'd like to see the instruction set for one of them.
  • Remember when it was suggested that PS2 components were potentially useful for things like signal processing and realtime navigation? Is anybody still laughing, or does this mean that just maybe that wasn't so hysterical after all?
  • And so do I, but does that mean that I am not a troll?
  • The meat of this submission was taken from

    http://ps2.ign.com/news/22490.html

  • No, it's something designed to model and render CG movies with. Who in their right mind would use it for a webserver?

    No, it's something designed to work with multimedia. Who in their right mind would use it for a webserver [betips.net]?

  • Someone is going to port Linux to this thing.
  • The specs give the memory size as 2 GB (128 MB x 16) - this implies a cluster of single-CPU modes rather than a single-memory SMP machine.

    $ cat < /dev/mouse

  • Nodes, not modes. Sorry.

    $ cat < /dev/mouse

  • Well, since most people pay for rack space by the unit, surely it would actually be better to make it deeper and less high?
  • No, not "Computer Graphic Imagery", "Computer Generated Imagery". See? Not redundant at all! The confusion is all in your head.
    --
  • They're Japanese. The Japanese say "CG" almost compulsively. It's one of those English buzzwords you see surrounded by katakana -- CG this, CG that. It's applied to anything from Toy Story to art hand-drawn in Photoshop (i.e. things we wouldn't normally call CGI in the Western world). And by the way, CGI has stood for Computer Generated Imagery since before Tim Berners-Lee set hands on a NeXT system.
  • A shared-bus approach with 16 CPUs would not provide enough bandwidth... and NUMA just lets programmers use bad code :)

    IMO this machine is much too difficult to master for anyone but the tool developers, and since they are already dealing with so much nitty-gritty, dealing with the segmented memory directly (and more efficiently) won't add that much more work.
  • Deeper would mean it requires a deeper rack to have cabling and airflow space. And because many computer cases are about the same depth, making it taller rather than deeper means it's going to fit into the same racks as the usual PCs.

    Anyway, who's going to put them into some rented server room? They don't seem like webservers to me...
  • What the heck is a Nintendo Star Cube?
  • IIRC, the export restrictions were directed toward 'unauthorized' countries. That is, Sony couldn't ship them (in quantity) to countries Japan didn't approve of. Naturally, they approve of the US, so there is no export limitation to the US.

    In any event, I don't think it was just a stunt, as we have just as many outdated 'weapon' classifications for stuff like crypto and the Apple G4 in the US.

  • Yes, but forget about the volume and magnitude of hardware inside - do they come in those cool and snazzy cases?
  • by AJWM ( 19027 ) on Tuesday July 25, 2000 @07:01AM (#907500) Homepage
    Face it, the best systems for games are the really expensive cutting-edge ones.

    Heh, yep. About two decades ago when I was working at Concordia University we got in a brand spanking new VAX 11/780, complete with a Norpak graphics unit for some mechanical engineering project, and an A/D - D/A converter for some speech recognition project.

    Starting from a simple program to put a shape on the graphics screen and move it around, it wasn't long before I had multi-player "Vaxteroids" running on it: input was by keyboard from the various VT-52 terminals in the room, display was on the central large monitor connected to the graphics box (not card!), and sound was via an amp and speaker I'd rigged up to the D/A outputs. The whole thing written in Fortran, running under VMS no less.

    Not quite up to the likes of Galaxian that was hitting the arcades about that time, but better (IMHO) than original Asteroids and multi (up to 4) players (kind of a cross between Asteroids and Space War).

  • You know, I still play Quake on my workstations. I have 2 Dell workstations and 1 SGI NT machine. Just because it's priced as a graphics workstation doesn't mean you can't enjoy all that extra power. :) BTW, SGI NT boxes suxor for playing Q3. I get much better performance out of my Voodoo2.
  • Since Maya and SoftImage seem to have Linux versions in the mill or already out, I'm pretty sure that if they're 64-bit clean (And I'd be surprised if this bunch didn't already have their code that way!) it'd be a simple matter to recompile for MIPS from x86 and have a Linux version of their apps ready to rock and roll.

    Yes, they haven't released which OS the GScube will run, but Linux would be a front-runner - free, already runs on MIPS anyway, and already has an application base that could be fairly easily moved over (and the companies in question would do that in a heartbeat if they thought this machine would take off...).
  • One of these links has already been posted above, but if anyone wants a well-written, easy-to-understand-but-plenty-technical introduction to why the Emotion Engine is cheap to produce and a terrible general-purpose chip, but an amazing design for high-quality low-latency 3D graphics, assuming programmers can figure out how to take advantage of it...

    start with these excellent articles from ArsTechnica [arstechnica.com]:

    Emotion Engine overview [arstechnica.com]

    Comparison of the EE's rendering process to that of a typical PC + graphics card [arstechnica.com]

    The second article is, IMO, the particularly interesting and relevant one, since the approach to rendering taken by today's high-end graphics workstations from SGI et al. is more similar to the PC + graphics card way of doing it than the EE way of doing it. Or rather, the PC + graphics card way of doing it was copied from the workstation approach. Of course, the major problem spot of the PC + card approach to rendering--the horrible bandwidth from the motherboard to the graphics card (the AGP bus is a joke compared to what would be required to actually stream textures into the graphics card in real time; as it is now, entire levels must be loaded into the graphics card memory and stay there until the next level is accessed)--is not such a problem on a high-end workstation. It'll be quite interesting to see how this GScube thing compares, but the specs are there for it to make a very inexpensive and powerful alternative to the standard SGI stuff.
  • Carmack gives a good explanation [stomped.com] of why we need more than just 32-bit color.

    Basically, he's saying that with higher fill rates, more multipass rendering is possible (display resolutions aren't going to increase to match by that much, and getting 5000 FPS is kind of a waste, so more multipass rendering is the most effective use of it), but with that many more passes comes more error, even at 32-bit. It may not matter now on most consumer cards, but on a machine like this it's going to show up if you push it to its limits.
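
    You can watch the error pile up in a few lines of C (a toy, one channel, multiplicative passes only):

      #include <stdio.h>

      int main(void) {
          unsigned char c8 = 200;  /* 8-bit channel value */
          double cf = 200.0;       /* full-precision reference */

          /* eight 90%-intensity blend passes */
          for (int pass = 0; pass < 8; pass++) {
              c8 = (unsigned char)(c8 * 0.9);  /* truncates every pass */
              cf = cf * 0.9;
          }
          printf("8-bit: %d  float: %.2f\n", c8, cf);  /* 84 vs 86.09 */
          return 0;
      }

    Per-pass rounding has already eaten a couple of counts after eight passes, which is exactly the kind of error Carmack is talking about.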
  • You'd shell out that much for a well stacked dual CPU machine- this sounds like something that'd crush all but the most expensive (say $10K and above) machines. If that is, in fact, the case, $5k's a bargain.
  • Okay. Apple ties itself strongly to a cube form factor in a matter of days. (Yes, NeXT and Cobalt, but their target market audience was/is decidedly smaller than Apple's, which is more general consumers than people looking for high-end workstations or mini-servers.) Sony releases this GScube. Groovy. Let's see...

    Why not just take an Apple form factor, add a GS and call it our own product? GScube. Transpose: cubeGS. Rhymes with: IIgs.

    :-)

    ----
  • Ahh, enlightenment. Thanks to all for clarifying this, that does make much more sense.
  • by Chris Johnson ( 580 ) on Tuesday July 25, 2000 @06:00PM (#907508) Homepage Journal
    Matt, you're a bit out of date - comparing max signal with what the theoretical minimum noise should be is not a useful measurement :) The trouble is, most forms of digital audio (including Red Book CD audio) are _linear_ encodings. Sony has some kind of DVD-audio scheme going where it encodes the phase change of the waveform's angle - they are alone in this; it's linear encoding all the way for pretty much any form of digital audio you care to name in use today.

    This is simply not finely grained enough for professional use. Do some calculations - first, you know what 8-bit audio sounds like? Familiarize yourself with how bad that sounds and how grungy it is. In 16-bit linear encoding, 8-bit sound is present at a volume level of 0.39 percent of the total volume of the recording (less than a hundredth of full volume). This does not sound significant, but check that out in _dB_ - volume is _logarithmic_. In decibels, that 'grunge zone' with eight-bit resolution is not 96 dB down, not 80, but around _50_ dB down. 50 dB and 96 dB are pretty damn different, aren't they? 50 dB is within the range of any junky thing with speakers on it.

    When you say that 16-bit is 96 dB and 96 dB down is 'the quietest sounds you can hear', you are conveniently overlooking the fact that those 'sounds' 96 dB down are _one_ _bit_. One bit is not 'sound'. One bit is old PC speakers, or music played over the one-bit tone generator on old Apple IIs. I would argue that 8 bits is not sound either, but cheap noise to send over the web or something - and if you listened to 8-bit audio even at 44.1K you'd likely agree. Yet that 8-bit zone is in _all_ 16-bit recordings, a mere 50 dB down. Anything around 50 dB down is being represented by merely 8 bits... hell, 12 bits is still noticeably compromised, and that is a mere 20 dB down. This sort of thing is not acceptable for professional work - hence the amazing and deeply needed proliferation of 20- and 24-bit devices, and of internal busses of DSPs running at 24 or 32 or even 48 bits or more, for doing calculations without losing everything to rapidly accumulating bit error.

    20 bits, 24 bits are not perfect, but they are a hell of a lot better. A 20-bit unit like my ADAT is running around 12 bits where a CD would be down to 8 - which is enough better (it's 50 dB down, after all) to leave little room to gripe. 20 dB down, the 20-bit unit is at 16 bits. If you have a fully 24-bit unit, it's going to have 16 bits available a whole 50 dB down, and get very close in practice to what you mistakenly believe about 16-bit digital audio - that you'll get infinitesimally faint sounds recorded and sounding convincing and believable. There's no way you're going to do that when 50 dB down has all the resolution of a Sun .au file, but it doesn't take all that many more bits to fix matters.

    I should thank you for inspiring me to hunt down these various formulas and tables and to work this out mathematically - I didn't realise it was quite as bad as it is! :) It's a piss-poor engineer who can't get 50 dB of dynamic range out of his recordings if he tries :)

    When you are looking at your specs, use these numbers for a reference of what the maximum signal quality is for various bit depths (given linear encoding these are the SAME NUMBERS as Matt gives - but ouch, when you get a sense of what it actually means in practice!):

    • 16-bit = Sun .au 8-bit zone, ~50 dB down
    • 20-bit = Sun .au 8-bit zone, ~72 dB down
    • 24-bit = Sun .au 8-bit zone, ~96 dB down
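
    If you want to check the arithmetic yourself, each bit of a linear encoding is worth 20*log10(2), about 6 dB, so the 8-bit grunge zone sits (wordlength - 8) bits' worth below full scale. A few lines of C:

      #include <math.h>
      #include <stdio.h>

      int main(void) {
          int wordlengths[] = { 16, 20, 24 };
          for (int i = 0; i < 3; i++) {
              int bits = wordlengths[i];
              /* level at which only 8 bits of resolution remain */
              double db_down = 20.0 * log10(pow(2.0, bits - 8));
              printf("%d-bit linear: 8-bit zone %.1f dB down\n",
                     bits, db_down);
          }
          return 0;
      }

    That prints 48.2, 72.2 and 96.3 dB - which is where the round numbers in the table above come from, give or take the rounding.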
  • It's called SMP. Maybe you've heard of it? 3D rendering is hideously parallelizable. You've got a few hundred million vertices laid out in memory, and you can process them all in any order. It's a cinch to divvy that up between the processors. Then, after geometry, you can do a lot of the lighting in parallel. Or if you're doing raytracing, you could assign a ray to each processor. As for rendering, you can simply do the SLI-type thing 3dfx does. I mean, putting many rendering and geometry engines together is nothing new. Just take a look at some of SGI's RealityEngine machines.
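
    For the geometry half, here's a plain-pthreads sketch of the divvying - nothing PS2-specific, all names made up:

      #include <pthread.h>
      #include <stdio.h>

      #define NCPUS  16
      #define NVERTS (1 << 20)

      typedef struct { float x, y, z; } Vert;
      typedef struct { int first, count; } Slice;

      static Vert verts[NVERTS];

      /* Each worker transforms its own contiguous slice of the vertex
         array; the slices don't overlap, so no locking is needed. */
      static void *transform_slice(void *arg) {
          Slice *s = arg;
          for (int i = s->first; i < s->first + s->count; i++) {
              verts[i].x *= 2.0f;  /* stand-in for a real matrix multiply */
              verts[i].y *= 2.0f;
              verts[i].z *= 2.0f;
          }
          return NULL;
      }

      int main(void) {
          pthread_t tid[NCPUS];
          Slice slice[NCPUS];
          int per = NVERTS / NCPUS;

          for (int c = 0; c < NCPUS; c++) {
              slice[c].first = c * per;
              slice[c].count = per;
              pthread_create(&tid[c], NULL, transform_slice, &slice[c]);
          }
          for (int c = 0; c < NCPUS; c++)
              pthread_join(tid[c], NULL);

          printf("transformed %d vertices on %d threads\n", NVERTS, NCPUS);
          return 0;
      }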
  • What do they mean by real time content generation?

    For starters, much of what Avid [avid.com] does. Real-time effects for news and sports shows, for example.

  • This is wrong. The consumer level version of the chip is poor (not terrible) for general purpose use, but this is because it needs to be in a $299 device. So, for example, there's only 8K of data cache and 16K of instruction cache, and the clock speed is a "slow" 300MHz.

    Have you read the articles I linked to?? I'd guess not.

    If you did, you'd find that the EE is very strongly optimized for SIMD and not at all for general purpose computing. In effect it's sort of like an AltiVec latched onto a 68030 core instead of a G3/G4.

    Hmmm...re-reading the above I find it humorous that a 300MHz chip isn't treated with reverence.

    I wasn't dissing the clock-speed. Indeed, I wasn't dissing the chip at all--I was defending its ability to go up against SGI machines, for cryin' out loud! I was simply pointing out that it was designed to handle 3D graphics, not as a general purpose chip. If you think that's an insult against the EE, then you don't know very much about it.
    It's called SMP. Maybe you've heard of it?

    Ohhhh, so that's why those two penguins are there when I boot up. I always thought it stood for Supports Multiple Penguins. Silly me.

  • by moller ( 82888 ) on Tuesday July 25, 2000 @04:47AM (#907514) Homepage
    The difference between GScube and some of its predecessors in this field of computer equipment seems to be that GScube is being created with real-time content generation in mind. The overall plan seems to involve generating content with this development system, which is then streamed from a powerful server to viewers downloading it via broadband Internet connections.

    What do they mean by real time content generation? Is the article just throwing out buzzwords or do they think that someone is going to be creating something on one of these boxes and simultaneously uploading it so that it can be streamed from a "powerful server?" That would seem a little hard to swallow. Are they trying to say that these things can render so quickly that they will practically (or actually) be able to pull off real-time raytracing as a developer is creating something, so once he has finished he can send it to the server to be downloaded to everyone lucky enough to have a high-bandwidth connection?

    mmm...real-time raytracing. that would be something.

    Moller
    >Carmack gives a good explanation of why we need more than just 32-bit color.

    That's a good pointer. Thank you.

    What he's talking about is basically roundoff error that occurs when multiple operations are applied to a pixel. He may have a point. OTOH, there's a little voice in my head that says the accumulation of roundoff errors is not really inevitable, but is an artifact of exactly how and in which order those operations are done in a "traditional" rendering scheme. Obviously it's going to be even worse with something like 3dfx's fractional-pixel FSAA scheme. Maintaining higher-precision color information, at least in some parts of the rendering process, may be the easiest solution, but I think we also need to consider whether this accumulation of roundoff errors is a sign that traditional rendering methods are headed down the wrong path entirely.
  • by codemonkey_uk ( 105775 ) on Tuesday July 25, 2000 @05:25AM (#907523) Homepage
    What do they mean by real time content generation?

    ...
    My take on it was that rather than rendering frames and storing them on a server which pipes them to the end user, it creates frames as they are requested. This would permit things like product placement on the fly. A popular piece of content might have different ads in the background each time it's watched.

    Or as another poster suggested, put to better use for things like weather reporting.

    Thad

  • "strategic vision for the evolution of computer entertainment in the broadband network era."

    To me, the most entertaining part is building the broadband network infrastructure itself - once that's done, it'll just be a pipe for all the same old commercials, spam and asst. garbage that clogs up AM radio, TV, Cable, telephones, snail mail, email, search engines, etc. Ya know, business as usual :))

    Have a fun day....
  • I'm just wondering whether Sony will again attempt to make another secret deal with the Japanese government, so that the GScube will be classed under the same paranoid export laws [slashdot.org] that were initially imposed on the PlayStation2 because it might be "reverse engineered by terrorists into a weapons launch/flight control platform".

    It is believed by many that Sony had in fact asked its government to do this in order to:

    1. Create enormous media hype. This of course happened very successfully and even had its own Slashdot story!
    2. Prevent people from legally exporting quantities (larger than 2?) of PS2s from Japan into a foreign market where Sony hadn't officially launched the product. Most notably, the United States.

    On the other hand, considering that:
    "The GScube prototypes are powered by 16 Emotion Engines and 16 Graphics Synthesizers. Yes, this is 16 PlayStation2s rolled into one."

    and thus has considerable processing muscle, wouldn't it be extremely ironic if the GScube is placed under strict export laws by the Japanese government because it is a mini-supercomputer that can become a "threat to world peace by 'rogue states'." Now wouldn't that be a hilarious thought?

  • by FattMattP ( 86246 ) on Tuesday July 25, 2000 @05:32AM (#907527) Homepage
    So what software is available for it? This is going to be about useless unless there is some useful software ported to it. I doubt that we are going to see Maya [sgi.com], SoftImage [softimage.com], or any Discreet Logic [discreet.com] tools on it anytime soon. Granted, big companies like Digital Domain and ILM can dedicate some programmers to porting in-house tools, but why would they want to? For the same amount that it would cost them to pay for the developers' time, they can buy a couple more SGIs. I'll be curious to see what the future brings for this product.
  • This "GScube" is a cheap knock-off of the "G4cube" [slashdot.org], which is in turn a cheap knock-off of the "borGcube" [netmoon.com]. Everyone knows the Borg assimilated the graphics community a long time ago.

    -- Dave

  • by be-fan ( 61476 ) on Tuesday July 25, 2000 @10:00AM (#907531)
    First things first: they probably won't run Linux on this thing. Even if they do, it will
    A) be heavily tweaked, or
    B) simply serve as a host for some special access libraries.

    This is a very parallel machine with a lot of quirks (graphics RAM divided into 16 32MB chunks, for example - read the article about the difficulty of programming the PlayStation 2 earlier on /.), and to get full performance out of it, it's going to take specialized software. Not only are most off-the-shelf apps just not designed for 16-way operation, but none are designed with the peculiarities of the PS2 hardware in mind. More likely than not, Sony will get some companies to port special versions of their software, and will probably use a custom OS for the machine. (Though the choice of OS really doesn't matter.)

    As for performance, specially designed software will scream on this machine. 3D is a very easy task to split up between multiple processors, and thus the 16-way architecture will result in at least 10x the performance of a PS2. To put that into perspective, a PS2 is a good deal more powerful than a GeForce2 GTS, probably on par with (or exceeding) Intergraph's Wildcat 4210 (the fastest PC-based OpenGL accelerator). Now at 10x that performance, you've got a machine easily capable of trashing most SGIs. It's not a RealityEngine, but if Sony can pull this off at a sane price it should be quite a machine.

    Secondly, has anybody noticed the amount of embedded RAM on this thing? 32MB per chip! Not only is that one hell of a jump from 4MB on the PS2, but I'm wondering how they got a manufacturing process to handle all those transistors without having 1% yields.
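
    Pure speculation on my part, but the classic way to gang up 16 rasterizers that each own a private chunk of memory is scanline interleaving, SLI-style: every GS takes every 16th line, so the load stays balanced wherever the geometry lands. In C:

      #include <stdio.h>

      #define NGS    16    /* graphics synthesizers */
      #define HEIGHT 1080  /* assumed frame height */

      int main(void) {
          /* GS n owns scanlines n, n+16, n+32, ... */
          for (int gs = 0; gs < NGS; gs++) {
              int owned = 0;
              for (int line = gs; line < HEIGHT; line += NGS)
                  owned++;
              printf("GS %2d renders %3d scanlines\n", gs, owned);
          }
          return 0;
      }

    Whether Sony actually splits the screen this way, or by tiles, or per frame, is anybody's guess.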
  • It's stupid to use BeOS in its current state as a webserver. It can be done (especially since BeTips is running the experimental BONE networking), but why? BeTips simply runs the site on BeOS because
    A) that's his personal machine and he uses BeOS on it, and
    B) he wants to show support for the platform.
    The GScube is not designed for web serving. It's got small caches and unspectacular integer performance. Additionally, you'd essentially be wasting the 512MB of graphics RAM and the 16 graphics processors. It's like buying an SGI InfiniteReality to do webserving. Sure it's doable (or maybe it won't be; the GScube might use a custom OS), but you'd be wasting your money on the additional graphics hardware, and you'd be getting less performance than you would from a much cheaper machine that was designed to handle webserving.
  • by Lonesmurf ( 88531 ) on Tuesday July 25, 2000 @04:55AM (#907539) Homepage
    Gee Ess Cube?
    Gee Scoob Eee?

    Hey! Scooby snacks for everybody!

    Rami James
    Guy who fried his brain today. (Must.. go.. home!)
    --
  • The campaign for toroidal computer cases begins here.
  • Quite possibly, this could be used to generate images based on sensor inputs, using the collected data to manipulate models in real time and then output it for rendering simultaneously. Weather data could be handled this way, or synthetic newscasters like the ones ZDTV has been tinkering with.
