Massive Storage Advances 279
pra9ma writes: "Scientists from Keele University, in England, have succeeded in creating a system that enables up to 10.8 terabytes of data to be stored in an area the size of a credit card, with no conventionally moving parts. This, along with 3 other forms of memory, could revolutionize storage. The company said the system could be produced commercially within two years, and each unit should cost no more than $50 initially, with the price likely to drop later."
I'm unconvinced about their compression algorithm, but if it works, this is gonna be amazing.
sure (Score:1)
Re:The compression algorithm... (Score:1)
I, too, can offer this technology (Score:1)
Stop posting ridiculous stories like this, and you will save terabytes in bandwidth and storage requirements for all the "you've been had" comments.
Isn't it impossible? (Score:1)
11010111 10110100
Those are our two bytes.
How would you record the difference?
The 'difference' would be:
01100011
Now how does that take up any less space? Just a little food for thought... no?
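A minimal sketch of the point (my own illustration, not anything from the article): XORing the two bytes to get their "difference" still yields a full byte, so nothing is saved.

a, b = 0b11010111, 0b10110100
diff = a ^ b                   # bitwise 'difference' between the two bytes
print(f"{diff:08b}")           # 01100011, still 8 bits
assert diff.bit_length() <= 8  # the delta needs as many bits as the originals

Delta coding only pays off when the differences are small or sparse enough to be entropy-coded in fewer bits, which random bytes are not.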
OH MY GOD! (Score:1)
Nothing like technology induced orgasms... yummy
Text Obsession (Score:1)
Napster frigging alternatives.. (Score:1)
Re:This may never happen . . . (Score:1)
Re:Nonsense (Score:1)
--
Serious implications for Vendors if true (Score:1)
Just imagine what this would do to firms like StorageTek if true. It will wipe them out. Why spend £0.5M on a Powderhorn solution (~400TB) when you can spend £2000 on 40 credit cards that can hold the same?
Macka
Re:Mmmm.. Hype! (Score:1)
Sure, these things aren't vital, but they're certainly not useless.
... (Score:1)
--
Unlikely (Score:1)
Re:[ot] Your sig is US-101k-centric (Score:1)
Re:This may never happen . . . (Score:1)
he said barney miller
heh heh...
Huge storage just around the corner! (Score:1)
Re:Mmmm.. Hype! (Score:1)
Kevin Fox
Re:The compression algorithm... (Score:1)
what about IO speed? (Score:1)
Re:Nonsense (Score:1)
So to repeat what every other poster has already said, which putz put this story up? I could point you to stories from The Onion that make more sense.
I see it now (Score:1)
Its cross-section is credit-card sized; they didn't specify a height.
Anyone found the patent? (Score:1)
I'm interested to see if it's anything that I have already done. Does anyone have any details on the patents?
It won't replace hard drives... (Score:1)
Don't expect it to replace hard drives any time soon, let alone RAM. 100Mb/sec is pretty slow, compared to, say, Ultra-2 SCSI (640Mb/sec) or Ultra ATA/66 (528Mb/sec).
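For scale, a quick conversion of those megabit figures to megabytes (my arithmetic, assuming the quoted numbers are megabits per second):

# Convert quoted megabit-per-second figures to megabytes per second.
for name, mbit in [("claimed device", 100), ("Ultra-2 SCSI", 640), ("Ultra ATA/66", 528)]:
    print(f"{name}: {mbit / 8:.1f} MB/sec")
# claimed device: 12.5 MB/sec; Ultra-2 SCSI: 80.0 MB/sec; Ultra ATA/66: 66.0 MB/sec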
Applications.. (Score:1)
AV recording capacity: at 33.34MB/minute, 10.8TB = 323,935 minutes of recording for one 16-layer PC Card storage device. Woo. Fucking! Hoo!!
Becoming a gargoyle (a la Stephenson) would be a practical reality
One of the major ways to save bandwidth with client-server games is to have a large library of precomputed models, textures and animations. If the game ships on one of these cards, it would be possible to automatically pre-render every possible combination of every possible movement and texture of every model in your game to a near photo-realistic level of detail, letting you just transmit an index. And still have space left over for an abridged Library of Congress.
Ever find yourself needing to look up something on the 'net, but lack access? Whip out a Net-on-Card and find what you need.
I just hope it's not vapor. Please, PLEASE, don't let it be vapor.
Re:GOATSE.CX LINK (Score:1)
Re:The compression algorithm... (Score:1)
--
Some more info - but I am still sceptical (Score:1)
But the cache once again comes through: here [google.com]. However, it's still light on details, though it does mention that the Prof is Professor Emeritus of Optoelectronics at KU, and that his "main focus over the last thirteen years has been the research and development of 3-dimensional magneto-optical recording systems."
It appears that this has been in the news before, as early as September 1999, in The Register [theregister.co.uk]. I can't say that I'm impressed with the other "scientific curiosities" they mention CMR promoting, like "Zodee," the "disposable toilet cleaning device which avoids the hygiene problems associated with conventional toilet brushes."
And now that I look closer, it seems
See also Unitel, Inc. [unitelnw.com] They claim to be developing HOLO-1, "the first practical quantum-computing device, which can be economically manufactured and introduced into the current computer industry." The esteemed Prof. is listed in their subcontractors section complete with picture.
-- gold23
Re:The compression algorithm... (Score:1)
Your source file is probably generated by Microsoft. It is not unusual to see MS filesizes that are 100 to 1,000 times larger than the actual text. The padding can contain long strings of zeros, which can be highly compressed.
Try compressing a text file generated in a plain ASCII editor. You might get different results.
Re:The compression algorithm... (Score:2)
god what happened to slashdot? (Score:2)
If he'd laid it on any thicker I would've suffocated. Maybe the reason we have people posting crap about CmdrTaco raping animals is that 95% of the people here can't understand more subtle criticism.
...and now even ShoeBoy is no more....time to nuke the site from orbit, I think
Storage space per square cm (Score:2)
Tighter storage media also needs to safeguard the data on it better. Heaven help us all when we back up all our word processor documents to a tenth of a millimeter and a fly sneezes on it.
Re:This may never happen . . . (Score:2)
Reminds me of the TCAP.... (Score:2)
All that, and they packaged it in a Pentium II case!
---
Fucking Cookie Spam (Score:2)
Fortunately, I use Opera. It alerts me and lets me block 'em.
--
Re:10.8 terabytes with today's systems? (Score:2)
Re:New storage ratings... (Score:2)
MPAA must be pissed off.
= 17,400 CD's (presuming 650MB per CD)
So is RIAA.
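The arithmetic roughly checks out (my sketch, using binary units):

tb = 10.8 * 2**40     # 10.8 terabytes in bytes
cd = 650 * 2**20      # one 650MB CD in bytes
print(tb / cd)        # ~17,422 CDs, matching the "17,400 CD's" figure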
---
Patience?! (Score:2)
However, I agree that a lot of people, including Slashdotters, seem to think anything that isn't out now is vaporware, and anything that isn't likely to be out in less than two years is sci-fi. Maybe it'd be nice if having an idea made it suddenly materialize, but unfortunately a lot of it takes work, and that takes time. That doesn't mean the idea is vapor, just that it isn't ready to wrap in plastic bubble wrap and mail to every Sam, Dick, and Mary who knows how to order from AmazonSucksAwayMyCash.com.
Re:Nonsense (Score:2)
Yeah, mp3s are pretty unremarkable too...
Cheers,
Tim
Copy Protection, copyright, etc.... (Score:2)
I think it might be important that we get copy protection/copyright issues resolved before these new storage technologies arrive.
As more proprietary stuff is produced -- and if it has killer-app serious storage capabilities -- several things will happen:
1) people will realize that they can store all the movies they want to watch and trade on their peer-to-peer networks
2) The media bullies of America will realize this too, and rather than develop a new business model and adapt, will demand draconian restrictions.
3) It will be easier to slip the "protection" mechanisms into the emerging proprietary technologies....
Bottom line: we need to make sure the issue is resolved sooner rather than later.
--
Mmmm.. Hype! (Score:2)
Funny, I don't think of PDAs and cellphones as requiring large amounts of memory. My PDA has 2 megs, not 10 terabytes. My phone has about 32K, not 32 trillion K. Yet both seem to do their jobs pretty well...
Besides, cellphones, by definition, have wireless connectivity. What do they need gigs and terrs of storage for?
Kevin Fox
Re:YOU=MORON (Score:2)
Hey. That sort of looks like a compression algorithm.
I wish I had moderator points left over... pointing out that he's an idiot (besides the flaming profanity) is the best I can do.
Re:The compression algorithm... (Score:2)
Surprise and information are related somehow, but I can't remember how, and I think my text on this is at work right now. So, if someone could fix that, I would be quite grateful.
How's that for a first on Slashdot?
Compression has everything to do with *math* (Score:2)
And, for your information, entropy means exactly the same thing in compression as it does in physics: it's the total information available in the system. You can't compress something past its entropic limit without the compression being lossy. (There's no physical analog to information loss without black holes being involved, and even that's questionable).
What you're talking about is the fact that there are only "generic" compressors, rather than "format-specific" compressors - they treat all data as random byte-strings without any structure. I don't know if any truly 'format-specific' general compressors exist: it seems that any useful program would have to include a whole lot of format-specific models, and the main problem is that the things which take up huge amounts of space *are* essentially random patterns of bytes.
No one's really worried about their text files filling up their hard drive. Honestly, if you can find a kind of file which compresses poorly via standard methods and takes up huge amounts of space, I'd be surprised. Video? Already have video-specific compression. Images? Already have image-specific compression. Audio? Already have audio-specific compression.
The problem with standard compressors nowadays has nothing to do with the method they use to compress: some extremely smart people are working on this, and they've found that the analogy is exactly true - information theory works. Period.
Re:The compression algorithm... (Score:2)
It wasn't a bad example - it was a 'curious' example, because Hawking radiation is a 'curious' process. It is anti-entropic, at least to our understanding of it. Granted, we don't have a happy black hole to play with in the lab, but... who knows?
Hey, after all this, someone actually might be able to figure out some way to generate a better-than-Carnot engine using a black hole. Its cycle time would probably be insanely long (of order billions of years).
My personal guess, however, is that we're all smoking crack and there's quite a bit more to Hawking radiation than we think, due to other effects. For instance... how exactly does the area of the black hole change upon emission of a Hawking quantum? Does it change perfectly radially? It can't - that would violate causality. It has to 'ripple' across the black hole - this will cause a gravitational wave as well. It's possible that this gravitational wave may contain additional information/entropy as well. I don't believe in perpetual motion machines.
By the way, I will "make that argument", and if I use a word in a way that's counter-intuitive, sucks to be you. Change your intuition. But, as others have pointed out, I'm not using it counter-intuitively at all - entropy is information. Period. End of story.
Re:The compression algorithm... (Score:2)
No /conventional/ moving parts, 1 cm /square/... (Score:2)
---
Re:Nonsense (Score:2)
Sounds to me like a highly indexed hash table, with a large token space
by comparing each word with its predecessor and recording only the differences between words
Not enough details there, but 8:1 compression using a token/hash scheme sounds reasonable (a rough sketch of this kind of front coding appears below). I've heard that web search engines (AltaVista, Google, and their ilk) use a similar algorithm to obtain between 10:1 and 20:1 compression on web texts, since there is so much redundancy in web pages. Since most pages have identical lengthy string sequences (trashed slightly because I haven't the energy to figure out the
Since I work with a lot of already compressed data, I discount any media compression claims. I'd avoid any storage media that incorporated hardware-level compression, because it would eventually lead to problems. Real databases maintain their own raw partitions on disks, since they can create a highly efficient file system for their own purposes. When the hardware starts returning varying free-space results because compression isn't working, DBs either fall over hard (Sybase) or fill the logs with errors (Oracle).
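A rough sketch of that kind of front coding (my own guess at what "comparing each word with its predecessor and recording only the differences" might mean - not KHD's actual, unpublished method): store only the shared-prefix length and the new suffix for each word in a sorted list.

def front_code(words):
    """Encode a sorted word list as (shared-prefix length, suffix) pairs."""
    encoded, prev = [], ""
    for w in words:
        k = 0
        while k < min(len(prev), len(w)) and prev[k] == w[k]:
            k += 1                   # length of prefix shared with predecessor
        encoded.append((k, w[k:]))   # record only what differs
        prev = w
    return encoded

print(front_code(["compress", "compressed", "compression", "computer"]))
# [(0, 'compress'), (8, 'ed'), (8, 'ion'), (4, 'uter')]

This saves a lot on sorted dictionaries, but nothing like 8:1 on running text, which is the point above.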
The magneto-optical-fluid disk sounds like a laboratory-sized research project they hope to reduce to the footprint of a credit card, but they neglect to mention that it towers 208 inches high
with no conventionally moving parts
Whenever something sounds like a marketing press release, with modifying adverbs like "conventionally", it pays to be skeptical, the forte of Slashdot.
the AC
Re:GOATSE.CX LINK (Score:2)
Re:The compression algorithm... (Score:2)
Or is that what LZW is? In which case, what is the name of that sliding-window approach?
Re:The Company's Press Release states... (Score:2)
EITHER some of the plaintexts must compress to the same shorter bit string (in which case how do you choose which one to decompress to) -- two pigeons in the same hole
OR some of the plaintexts don't compress at all.
So in either case, the press release is drivel. Caveat investor.
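A minimal counting sketch of the pigeonhole argument (my illustration): there are 2^n bit strings of length n, but only 2^n - 1 strings strictly shorter, so no lossless compressor can shrink every input.

n = 16
inputs = 2 ** n                          # all n-bit strings
shorter = sum(2 ** k for k in range(n))  # strings of length 0..n-1
print(inputs, shorter)                   # 65536 vs 65535: one pigeon too many
assert shorter < inputs                  # some input must collide or not shrink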
It's a global reference (Score:2)
A serious question. (Score:2)
Re:It won't replace hard drives... (Score:2)
A Short History of Keele (Score:2)
Keele University
The University College of North Staffordshire was founded in 1949 to become the University of Keele in 1962.
There was a deliberate aim to break away from the pattern of the specialized honours degree, avoiding as far as possible the divisions between different branches of study. Consequently, most students read four subjects in their degree course, two at honours level and two at subsidiary. At least one of these subjects must be from the arts or social sciences, and at least one from the natural sciences.
Many students have taken a four-year course, beginning their studies with the Foundation Year, in which they follow a broad course covering the development of western civilization.
Most students live on campus in halls of residence or in self-catering flats, and many staff also live on campus.
The Keele Estate
The University is situated on an estate of 650 acres, with extensive woods, lakes and parkland, formerly owned by the Sneyd family.
The Sneyds can be traced back in north Staffordshire to the late 13th century, but they came into the possession of the Keele estate in the mid-16th century.
The present hall was rebuilt in the 1850s for Ralph Sneyd (1793-1870) to the design of Antony Salvin at a cost of about £80,000. The grounds and gardens were magnificently laid out around it, and many interesting features survive today, such as the remarkable holly hedge, originally 199 yards long, 28 feet thick and 35 feet high.
At the beginning of this century, the hall was let to the Grand Duke Michael of Russia, who entertained King Edward VII there. Later, however, it remained empty, and troops were stationed on the estate during the second world war.
Publications are available which give the history of Keele in much greater detail:
A book entitled The history of Keele (edited by C.J.Harrison) is now regrettably out of print, but copies may still be found in appropriate libraries.
Pamphlets by J.M.Kolbert entitled The Sneyds; Squires of Keele, The Sneyds & Keele Hall, and Keele Hall; a Victorian country house are obtainable from Mrs D.Warrilow in Information Services (Tel. (01782) 583232).
Off the record; a people's history of Keele by Angela Drakakis-Smith, published by Churnet Valley Books at £8.95 (ISBN 1-897949-21-9), should be obtainable through any bookshop.
Keele; an introduction by Michael Paffard, obtainable from Chris Wain at the Alumni Office, Keele University, ST5 5BB (Tel. (01782) 583370) for £3 (ISBN 0 9534157 0 8)
Keele; the first fifty years by J.M.Kolbert, published by Melandrium Books, obtainable from 11, Highway Lane, Keele, ST5 5AN, for £16.95 + p&p. (ISBN 1 85856 238 4)
c.m.wain@uso.keele.ac.uk
conventionally moving? (Score:2)
with no conventionally moving parts
As opposed to, say, unconventionally-moving parts??
Ho Hum... (Score:2)
Now if IBM comes out and says they've found a way to squeeze 10 terabytes into the space the size of a credit card, I'll be impressed.
This ain't news. /.: Please fix editorial process! (Score:2)
The blatant reposts are getting annoying.
This technology will be ready for market in two years? Is that two years from now or two years from when an almost word-for-word identical article about this [slashdot.org] was posted a year and a half ago? =)
Re:The compression algorithm... (Score:2)
is that they've gotten 8:1 compression on *unicode* English ASCII text.
Uh, wrong (Score:2)
(Note: character != byte. That is only true of ASCII characters. If all you wanted to do was represent the 26 English letters, it would only take 5 bits per character. We're talking language here, irrespective of representation.)
Go read a good cryptography book and straighten out your terms and definitions.
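A quick check of that 5-bit figure (my arithmetic):

import math
bits = math.log2(26)          # information per letter if all 26 are equally likely
print(bits, math.ceil(bits))  # ~4.70, so 5 whole bits per character suffice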
Re:The compression algorithm... (Score:2)
Yeah, right. (Score:2)
Using a liquid between the read/write head and the recording surface would help the optical coupling between the surface and head, but creates a whole new set of problems. Probably puts a ceiling on media speed, for example. A whole set of mechanical problems have to be overcome to turn that into a commercial technology. Whether it's worth the trouble remains to be seen. For a 4X improvement in MO drive densities, probably not.
(There's a neat variation on this idea used for scanning photographic film, called a "wet-gate transfer". The film is immersed in a liquid with the same refractive index as the film base. This makes minor scratches disappear.)
WTF. (Score:2)
I hope they don't give anyone the wrong impression (Score:2)
wristwatches could have vastly more power than today's PC Computers.
Can nothing about technology go without being tainted by sensationalism? I am not even going to point out why this is wrong, as I am sure everyone realizes just how stupid it is without me having to say it.
10.8 terabytes! (Score:2)
Just More Funding Hype (Score:2)
The specific claims are:
Excuse me while I snort beer through my nose.
Oh! Let me get out my wallet to invest! (Score:2)
The second invention involves a different way of recording and reading information, increasing four-fold the amount of data that can be held on magneto-optical disks, which are used for storing computerised data. The third invention provides new kinds of coatings and materials that can be used in disks, providing a 30-fold increase in capacity.
The fourth and most interesting invention produces a memory system that enables up to 10.8 terabytes of data to be stored in an area the size of a credit card, with no conventionally moving parts.
This is like a total technological troll.
Re:The compression algorithm... (Score:2)
C:\WINDOWS\Desktop>bzip2 -k page.htm
C:\WINDOWS\Desktop>dir page*
Volume in drive C has no label
Volume Serial Number is
Directory of C:\WINDOWS\Desktop
PAGE_F~1 02-13-01 12:17a page_files
PAGE HTM 59,243 02-13-01 12:17a page.htm
PAGEHT~1 BZ2 8,098 02-13-01 12:20a page.htm.bz2
2 file(s) 67,341 bytes
1 dir(s) 3,892.71 MB free
C:\WINDOWS\Desktop>
That's a 7.32:1 ratio.
Re:500:1 compression can easily be achieved on tex (Score:2)
Riot! I love it, but it's not half as good as my multivariate transaxial parser generator. It can recompile the kernel in 0.2 seconds on my 386.
The accurate story, with access times (Score:2)
So access times are much slower than for a conventional hard disk.
How? (Score:2)
Re:Wow this is GREAT! (Score:2)
Re: (Score:2)
Re:The compression algorithm... (Score:2)
I remember in a recent Information Theory course I did at Uni, we learnt that the information content of an ensemble with 26 different equally likely outcomes is 4.7 bits per symbol.
That would be a very crude way to compress. LZW compression (and similar algorithms such as the one in gzip) find multiple-byte patterns, which are reduced to smaller and smaller bit representations as they occur more frequently. For example, if I had "ABCABCABCABCABCABCABCABC", it would figure out that "ABC" is being repeated and use a smaller number of bits to represent it.
That's why English text can typically be reduced by 8-10:1 compression, because there is so much redundancy in words. Try doing a gzip on a log-style file with lots of redundancy and you'll often see 100:1 compressions.
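A minimal LZW encoder (my sketch; gzip itself is LZ77-based, but the dictionary-growing idea is the same) showing how the repeated "ABC" pattern collapses into ever-fewer codes:

def lzw_encode(data):
    table = {chr(i): i for i in range(256)}  # dictionary starts with single bytes
    w, out = "", []
    for c in data:
        if w + c in table:
            w += c                           # extend the current match
        else:
            out.append(table[w])             # emit code for longest known match
            table[w + c] = len(table)        # learn the new, longer pattern
            w = c
    if w:
        out.append(table[w])
    return out

codes = lzw_encode("ABC" * 8)
print(len(codes))  # 11 codes for 24 characters; the saving grows with input length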
--
Re:This is unbelievable! (Score:2)
Re:The compression algorithm... (Score:2)
Take a look at Microsoft's latest release - Outlook Mobile Manager [microsoft.com]. The bit of interest is how they compress text [screenshot [microsoft.com]] using their *new* technology Intellishrink.
You can choose various levels of text compression from none, remove spaces/punctuation to remove vowels...
ugh
Re:Nonsense (Score:2)
Well, I just took out a business card, and wrote on the back "The letter 'a', repeated 10.8 x 2^40 times". Did I just store 10.8 terabytes of compressed data in an area the size of a credit card?
Call the press!
This may be impressive, even revolutionary, but we need more technical details.
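That joke is really about descriptions versus data. A minimal run-length sketch (mine) of why the business card doesn't count: the description is tiny, but producing the actual bytes still requires materializing all of them.

description = ("a", 10.8 * 2**40)  # a few dozen characters describing 10.8TB of data
char, count = description
# data = char * int(count)        # expanding it back needs ~11.9e12 real bytes
print(len(repr(description)), "characters describe", f"{int(count):,}", "bytes")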
What a pantload. (Score:2)
Re:The compression algorithm... (Score:2)
Well, most people don't buy massive storage units so they can keep extra copies of the English translation of The Three Musketeers lying around. Data is often much more redundant than that. For example, in my job I routinely deal with very large log files (1GB+). These often compress 100:1 or better due to the large amount of redundant content.
Uh oh, Professor Emeritus (Score:2)
Flat5
Re:Always the size of a credit card (Score:2)
Back to the question: Why credit card sized? Simple. It's easy to hold either with the fingertips or in a fist, it's easy to carry, it would fit in a pocket or a wallet, and it's light enough not to be noticeable. Most people could carry one or two of these to work with no effort. I would like to see you do that with a standard hard drive, or a zip disk, or even a floppy disk.
floating in lubricants (Score:2)
notice the language: no conventionally moving parts... plenty of unconventional movement, though.
Which brings me to my point: how can this invention be aimed at the mobile/palm markets if the read head is floating in lubricants?! here's to hoping they license some skip/shock technology from the walkman crowd...
Some questions I have about this (Score:2)
Really? Just working from the above quote, I do not see much in the way of compression, especially down to 1/8th the size. It might work for a dictionary, but actual useful text is going to be less similar.
Another question: is this actually REWRITABLE? I mean, I am reading this and they talk about recording and reading. However, is this write-once/read-many technology (in which case, it would be useful for technical reference)? Or is it write-many/read-many, in which case I can upgrade my hard drive to 250x its current size for $50? I suspect it is the former, in which case it is a nice idea, but not as useful as it first appears.
Even if it is only write-once, the ability to have 10 terabytes of storage in, say, a cell phone (even if I cannot reuse the data space) is still impressive.
-FlashfireUVA
what would one do with all that space? There isn't enough porn or music to actually download
Re:This may never happen . . . (Score:3)
Oh, get real. Both you and I know that by the time this technology (if it's real) makes it to market a standard OS install (take your pick, it won't matter) will be 5TB, using up half of it right off the bat. I, for one, will not be looking forward to buying Linux Kernel Internals -- 33rd printing, volumes 1-53.
And, in ten years, I'll STILL be on a fucking 56k-when-hell-freezes-over-more-like-26.4 dialup while Suzy N'Syncempeethrees and Sammy Likestoforwardjokes III have blistering Ultra-DSL at 30Gbps. Grrr.
Sorry for the rant.
Re:The compression algorithm... (Score:3)
As to my colleague, he'd read virtually all the published literature in the area, and he's a pretty smart cookie (he's now on a PhD scholarship at Princeton working with people like Tarjan). The thing I learned most from his efforts was that text compression is in a period of diminishing returns for improved algorithms - they're not likely to get much better.
Re:The compression algorithm... (Score:3)
While it is correct that studies with humans have indicated that English text has about one bit of entropy per byte, suggesting a natural limit of about 8:1 compression, humans have the use of a whole lot of semantic information (they understand the meaning of the text and can therefore predict words based on that) that no compression algorithm I'm aware of has used.
I'm taking this with a large grain of salt, thanks.
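A minimal sketch (my illustration) of why that 1-bit-per-character figure needs context: a plain letter-frequency (order-0) estimate lands around 4 bits per character, and only models that exploit context get anywhere near 1.

import math
from collections import Counter

def order0_entropy(text):
    """Bits per character from single-character frequencies alone."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

sample = "english text has about one bit of entropy per character"
print(order0_entropy(sample))  # roughly 4 bits/char without any context modeling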
Re:The compression algorithm... (Score:3)
OK. A little background on information theory for you - you know, from Shannon, back in 1948. There is an object in information theory called the partition function of an experiment - it is essentially the chance of getting any result from that experiment. There is then an object called 'information', which is proportional to the log of the partition function. The lower the chance of getting a result, the higher the information content gleaned from that experiment. For instance, if you had a box full of quarters, and you randomly pulled out a coin, the partition function would be (quarters, dimes, nickels, pennies) {1,0,0,0}. The information content of that experiment is klogW: (0,inf,inf,inf) - you don't learn anything if you pull out a quarter. You knew there were only quarters to begin with. If you get any of the other ones, damn, you're surprised.
What does all of this have to do with entropy? Well, in thermodynamics, which is, you know, where the term COMES from, entropy is klogZ, where Z is the partition function of the system: essentially the same thing. k is Boltzmann's constant - it comes from the choice of temperature scale.
So, here's the news flash: Entropy is information. Period. Therefore, he was using the term CORRECTLY, not INCORRECTLY. Entropy is USEFUL information, not USELESS information. Guess what? This is the same in chemistry, too. The universe doesn't care whether or not you can use energy for work, and entropy has nothing to do with 'randomness'. A 'random' distribution of matter in a universe will collapse to a 'nonrandom' sphere, thanks to gravity: if entropy was randomness, then the universe would have just violated the second law - it went from 'random' energy to 'nonrandom' energy. (mass=energy, so don't even try it)
Entropy is information. Period. Hence the second law of thermodynamics- entropy increases because the information content of the universe is increasing. If you doubt me on this, here's a simple bit to convince you: you have a system which goes from state 1 to state 2, both of which have the same entropy. Therefore, there is a reversible process which connects the two states, which means that one can go from state 1 to state 2 and leave no tracks inside the system that the change had happened - i.e., the information content of the system is static.
I'm really getting sick of having to explain this constantly - I wish they would never teach entropy as 'useless energy' or 'unusable energy' - like the universe cares whether or not something can be used for work.
The link between information and entropy is entirely well known and extraordinarily important. For instance, if an object falls into a black hole, is there no record of its existence anymore? Is all the information that was inside that object lost? No - a black hole's area is related to its entropy, which increases with mass. Therefore, the 'information' (as far as the Universe cares) in that object is now somehow stored in the black hole's event horizon. Curiously enough, an object which falls into a black hole is, from the outside world's perspective, constantly getting infinitesimally closer to the event horizon. This is a weak argument, yes, and changing a few words could make it stronger, but this is offtopic, so I don't care.
In closing - you're wrong. Entropy is useful information. 1 bit of entropy out of 8 means an 8:1 compression ratio. Here, you've 'extracted' 7 bits of 'work' out of the system. The remaining 1 bit of entropy cannot be removed from the system, as entropy can never decrease. (or in this case, cannot decrease without destroying the system)
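The quarters example in Shannon's terms, as a minimal sketch (mine): entropy is zero exactly when one outcome is certain, and grows as outcomes get more surprising.

import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: all quarters, no surprise
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximum surprise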
Re:Nonsense (Score:3)
_____________
The compression algorithm... (Score:3)
Re:hmmm (Score:3)
--
More Info (Score:3)
KEELE HIGH DENSITY LTD
UPDATE - November 2000
During 1999 Keele High Density Ltd. (KHD) announced that it had developed a very high density memory system capable of holding 2.3TB of memory in the space of a credit card. Further work since then has resulted in some significant upward changes to both the capacities previously stated and to the applications the KHD technology addresses. Some of this work is continuing, and there are further patent applications to be filed. The information available publicly is necessarily restricted until those patents have been filed.
The very high data densities are achieved through a combination of many different factors - some relating to the physical properties of the recording media, and some to the way of processing and handling data. The physical memory system is a hybrid combination of magneto-optics and silicon. The KHD memory system is applicable to both rotating and fixed media, and is not dependent on the laser-based media-addressing system used. Following the work undertaken since last year, the following data capacities are achievable:
a) For rotating media, at DVD size, a single-sided capacity of 245 GB using a red laser.
b) For fixed media, a single-sided capacity of 45 GB/cm, giving a total capacity of 3.6 TB on the surface area of a credit card, double-sided and using a red laser. Using a violet laser (now being introduced), the capacity at credit card size will be 10.8 TB.
In last year's announcement from KHD the primary focus was on the fixed media application, which, with a novel form of laser addressing, could be described as 'near solid state' - involving no moving parts in the conventional sense. However, this aspect of the technology will require some further R&D work to bring it to a mass-production scale - although it is believed that this will not present insurmountable difficulties. These constraints do not apply to existing rotating media applications (for example, DVD), using conventional laser systems, and there are no reasons why the KHD technology cannot be implemented within a short timescale - measured literally in months.
A major development arising out of KHD's work over recent months is that the technology achieving these very high data density figures has application not just for memory systems, but will also produce significant enhancements for the transmission and processing of data generally. This means that KHD's technology can achieve an effective increase in bandwidth capacity, because the very high data density properties, which are in addition to those from conventional compression methods, allow so much more data to be transmitted over a given bandwidth. The same advantages are also felt in terms of processing speeds. Work on this aspect of KHD's technology is continuing, but the current calculations show that an effective eight-times increase in bandwidth capacity and processor speed can be achieved.
KHD's development represents a fundamental advance in computing technology, with the benefits being felt across many industry areas. Following completion of the patenting position, KHD will be looking to license the technology to companies for mass-production, and for the ongoing R&D work needed to make the 'solid-state' memory commercially viable. The technology has been developed by Professor Ted Williams at Keele University, Staffordshire, England, over a period of thirteen years.
PROFILE: Ted Williams is Professor Emeritus of Optoelectronics at Keele University, Staffs, England, and visiting Professor of Electronic Engineering at South Bank University, London. Professor Williams was Director of Research with Sir Godfrey Hounsfield, Nobel Prizewinner, working on the invention and creation of the first NMR Scanner at Hammersmith Hospital, London. He has also held directorships with major international companies. His main focus over the last thirteen years has been the research and development of 3-dimensional magneto-optical recording systems.
KHD's licensing and funding arrangements are managed by Mike Downey, Managing Director of Cavendish Management Resources. CMR is a venture capital and executive management company, based in London. CMR has supported the development of this technology.
Further information from: Mike Downey, Managing Director, CMR, 31 Harley Street, London W1N 1DA. Tel: +44-(0)20-7636-1744 Fax: +44-(0)20-7636-5639 Email: cmr@cmruk.com [mailto] Web: www.cmruk.com [cmruk.com]
Re:The compression algorithm... (Score:3)
a bit light on the details (Score:4)
film at eleven...
New storage ratings... (Score:4)
Anyone want to come up with some other ratings ?
Mark Duell
This is a year and a half old... (Score:4)
Let's see...
$ nc www.keele.ac.uk 80
HEAD /research/cmrkeele.htm HTTP/1.0
HTTP/1.1 200 OK
Date: Tue, 13 Feb 2001 11:39:15 GMT
Server: Apache/1.3.12
Last-Modified: Fri, 20 Aug 1999 12:16:30 GMT
ETag: "239a2-f60-37bd471e"
Accept-Ranges: bytes
Content-Length: 3936
Connection: close
Content-Type: text/html
Last modified 20 Aug 1999? Not what I'd call "breaking news"... If you don't believe the server date, try this corroborating evidence: http://www.cs.colorado.edu/pipermail/postpc/1999-September/000002.html [colorado.edu]
Why news.ft.com decided to post the story now, I couldn't say...
== Sparrow
Nonsense (Score:4)
Well that's pretty unremarkable. They've written a compression algorithm.
Oh, by the way, they have also invented
If that were true, why are they bothering to even *think* about their text compression algorithm? Fifty dollars a go? Who wants compression? If these people are telling the truth, we are talking about a thousand-fold increase in gigabytes per dollar over the space of two years.
The phrase "no conventionally moving parts" also brings to mind images of really whacky, non-linear moving parts flailing about. What the hell do they mean?
Absolutely no technical detail is given in the article, and as far as I'm concerned, this is yet another false alarm on the long road to entirely solid-state computer systems.
Always the size of a credit card (Score:5)
I just want to know what every tech inventor's obsession is with everything being the size of a credit card. It's not like we are going to fit these in our wallets. "Sure Mr. Tanaka, I have my 20 terabyte database here in my wallet, care to swap?"
I dunno, I just wish technology came in different sizes I guess.
Wow this is GREAT! (Score:5)
Wow 10.8 TB on a credit card, wahooo! What will they think of next? How do I send them guys my money? I couldn't find any address or nothing, but those english 'blokes' sure look like they is gunna go far with this invention - specially that text compression thingy - pretty damned original if I do say so myself. And then that storage mechanism 'no conventional moving parts' - I can't imagine how they got those conventional parts to stop movin, sound like quite a trick.
Anyway, don't you slashdot guys let the criticism get you down. I am with you. Don't listen to them naddering nabobs of negativism. They always persecute the dreamers!
I am looking forward to your next 'Light speed limit possibly violated' post with anticipation.
-josh
Re:Always the size of a credit card (Score:5)
The size of bread slices varies widely from region to region, which prevents multinational corporations from referring to their products as the size of a piece of sliced bread. Although ANSI created a sliced-bread standard in 1986 and updated it in 1992 to account for the coarseness of pumpernickel, this is an American standard, which prevents any company wishing to sell its product outside of the United States from using it. Unfortunately, the ISO has been dragging its heels on forming a sliced-bread standard, so until the day we get one you can expect many more credit-card-sized comparisons.
Re:The compression algorithm... (Score:5)
lynx http://slashdot.org/article.pl?sid=01/02/13/0240254&mode=nested&threshold=-1 > slash.txt
(no -source option because this is Slashdot, and as we all know too well, the content is much more redundant than repeating html tags, much, much more redundant)
shelf:~$ ls -l slash.*
-rw-r--r-- 1 stu users 20394 Feb 12 21:09 slash.bz2
-rw-r--r-- 1 stu users 23750 Feb 12 21:09 slash.gz
-rw-r--r-- 1 stu users 93867 Feb 12 21:09 slash.txt
shelf:~$
This gives a ratio of 0.22. Surprisingly, if you feed the same page to bzip2, but at threshold +2, the ratio increases to 0.27, implying that there is more entropy, and thus more information, in higher-scoring posts, which of course we know to be false :)
Perhaps with this firm mathematical footing, /. can proceed to a new chapter in moderation - moderation by bzip2. Articles which receive high compression ratios are marked down automatically. Of course, this would make it possible to earn a lot of karma, simply by posting random garbage. oh wait..
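For anyone who wants to reproduce this without a shell handy, a minimal Python equivalent (mine; point it at whatever page dump you like):

import bz2

def compression_ratio(path):
    """Compressed size over original size - lower means more redundancy."""
    with open(path, "rb") as f:
        data = f.read()
    return len(bz2.compress(data)) / len(data)

print(compression_ratio("slash.txt"))  # ~0.22 for the page dumped above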
This is unbelievable! (Score:5)