

Grand Challenges For The Next 20 Years
terrapyn writes "Infoworld is reporting: 'A group of British computer scientists have proposed a number of grand challenges for IT that they hope will drive forward research, similar to the way the human genome project drove life sciences research through the 1990s.' Did they get it right? What are some other worthy computing challenges?"
Just ONE request... (Score:5, Insightful)
Re:Just ONE request... (Score:2, Interesting)
Re:Just ONE request... (Score:2)
Re:Just ONE request... (Score:2)
Re: (Score:3, Insightful)
Re:Just ONE request... (Score:3, Insightful)
It is a request for ET Engineering Technology.
Re:Just ONE request... (Score:2)
Re:Just ONE request... (Score:4, Informative)
They already have prototypes small enough to power a cell phone, and they're approaching the marketplace. Cost is unknown, but you can expect them to be expensive at first. And if they take platinum as a catalyst, costs will of course stay high.
It will remain to be seen if people will accept carrying volatile fluids around with them, but I'm betting they'll come out with a "clean change" cartridge system that people will like. Just think: no recharging time. A small reservoir will probably allow for a hot-swap of the cartridge as well, meaning not even any down-time.
Next problem?
Rechargeable? (Score:3, Insightful)
Re:Rechargeable? (Score:3, Insightful)
Fuel will probably be available in cartridges that are shaped to fit the manufacturer's equipment. Replacing them will need to be as easy and fast as changing batteries. Don't forget that current fuel cells are designed with on-board cracking of methanol, which allows for liquid fuel rather than having a pressure tank of pure hydrogen. It will make things much more convenient, although at the possible expense of some size/weight, a
Re:Just ONE request... (Score:2, Interesting)
Comment removed (Score:5, Insightful)
Palm vs. laptop (Score:2)
Here's an IT challange... (Score:2, Funny)
Nothing new here (Score:3, Informative)
**yawn**
Re:Nothing new here (Score:2, Insightful)
Set
Re:Nothing new here (Score:2)
Who knows what will happen (Score:5, Insightful)
Re:Who knows what will happen (Score:2)
Re:Who knows what will happen (Score:2)
Re:Who knows what will happen (Score:3, Insightful)
Ubiquitous computing is a
DATA DATA DATA (Score:5, Insightful)
Re:DATA DATA DATA (Score:5, Insightful)
I have users with multi-GB mailboxes that can't quite be deleted, but archiving them doesn't really solve the problem either; it just makes it harder for the user to find what he's looking for.
So, it's a basic problem. Every day, we're generating more data. The amount of data (in bytes) is going up every day, as computers are more easily able to deal with higher-resolution pictures and movies. But what do we do with all this data? Just keep writing it to tape and storing it in bunkers? After we accrue enough data, what's the point of keeping it? You won't be able to find anything anymore.
It's a real problem for me, both as an IT pro and personally. When dealing with so much data, how do you:
Re:DATA DATA DATA (Score:4, Insightful)
Yeah, but part of my point was, not every grandma has a me to set *anything* up. I don't want to have to build a Unix system and write a custom solution for my grandmother anyway. I am not a one-man full-time tech-support staff for everyone I know. When I talk about a solution, I mean something that comes with the computer or is an easy-to-install add-on that grandma can do herself. I mean something that I can point out to some know-nothing and say "Buy this. It'll take care of your problems."
I suspect that banks will start providing safes for data soon - with some kind of access, like ssh
For grandma, they'd better have a better interface than CLI SSH. Maybe a program that uses SFTP, but with a nice GUI, but again, I'm not writing my own programs here.
The categories I use as keys are always fixed for me, and I get paths to them immediately, without needing to run find/grep each time
No offense intended, but you're still spending far more time than I'm talking about. Setting up unix servers with huge raid drives, finding an out-of-state site to stash it, setting up secure data transfers, devising your own method of assigning metadata to files or some kind of personal database file system....
I understand, for a geek, this isn't a ridiculous expense of time, since it's also a hobby and a source of fun and entertainment. However, to grandma (and even me) it's just too much.
When I talk about making photos "easy to find", I mean "easy" to the point that even Apple's iPhoto is still a bit too complicated, in that you have to assign keywords and ratings manually, which many users aren't going to bother with after a certain number of photos.
When I talk about easy to access, I'm talking about the process being relatively transparent, i.e. easier than connecting to an FTP site. Like you wouldn't need to know that it's "not on your computer".
When I talk about affordable, I'm talking about something like $100 total, or a $10 a month service (for personal use).
In case I'm not being clear, I'm not asking, "What's a good, cheap backup solution, available today?" I'm saying, the state of data management technologies is not currently sufficient for our ever-expanding set of data. We need better search methods for all sorts of data (not just text). We need transparent backup and archival methods (transparent both in the backup and the restore). We need more than solutions for businesses who can employ a big staff and thousands in hardware, and more than solutions for geeks who can roll their own. We need solutions so that Joe Schmoe can take digital photos to his heart's content, can create a digital music library as large as he wants, and not need to worry about sorting through the data or losing it.
Re:DATA DATA DATA (Score:3, Interesting)
Of course, many of these proposed filesystems allow for something like, "Give me all my jpg's that are larger than 640x480 and were created later than Jan 1st." So, already, we have more than keywords.
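That sort of query can be sketched against a toy in-memory index (the photo records, field names, and the `query` helper are all invented here for illustration; a real metadata filesystem would maintain the index itself):

```python
from datetime import date

# Toy in-memory metadata index; records are invented for illustration.
photos = [
    {"name": "beach.jpg", "width": 1024, "height": 768, "created": date(2004, 6, 1)},
    {"name": "thumb.jpg", "width": 160, "height": 120, "created": date(2004, 6, 1)},
    {"name": "old.jpg", "width": 800, "height": 600, "created": date(2003, 11, 5)},
]

def query(index, min_w, min_h, after):
    """All photos larger than min_w x min_h and created after a given date."""
    return [p["name"] for p in index
            if p["width"] > min_w and p["height"] > min_h and p["created"] > after]

print(query(photos, 640, 480, date(2004, 1, 1)))  # -> ['beach.jpg']
```

The point is that size and date are attributes the system can extract for free; keywords are the part that needs a human.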
However, I still don't think it's sufficient. If I have thousands of photos, is it really reasonable to expect that I am going to be comprehensive about adding keywords
Re:DATA DATA DATA (Score:4, Insightful)
I agree, we are being buried in data but perhaps that's because the emphasis is on collecting data rather than managing information.
IT will continue to be a benefit so long as we focus on precisely what we're gathering and structuring data for.
Re:DATA DATA DATA (Score:3, Insightful)
You are correct. That is why it's not called Data Technology.
However, I think the key is that people want information and computers store only data. "Data Mining" is the science of extracting a small amount of information from a mountain of data. I guess it's a bit of a misnomer.
Gold Miners mine through a mountain of quartz looking for gold.
I don't know what kind of structures silver is in, but it's the same deal: silver miners are seeking silver.
The last thing Data Mi
Re:DATA DATA DATA (Score:3, Informative)
To imply that we're only just working out what to do with all our information is not quite right because the principles of Knowledge Management are well established - for example one of the often
What are some other worthy computing challenges? (Score:3, Funny)
Re:What are some other worthy computing challenges (Score:2)
And ultimately, it depends on what you want, too.
Personally, I've had great experiences with all of the above OSes. While issues do crop up from time to time, it would be unwise to assume that they would not, in the future.
I'm still waiting for things promised by Y2K (Score:5, Funny)
What a joke that turned out to be. I'm still making calls with an audio-only phone and I have yet to come across a practical hover-car.
Re:I'm still waiting for things promised by Y2K (Score:5, Funny)
Re:I'm still waiting for things promised by Y2K (Score:2)
Re:I'm still waiting for things promised by Y2K (Score:2)
Granted, we're pretty crazy about cell technology here (although not as much as Japan).
Re:I'm still waiting for things promised by Y2K (Score:2)
When that happens, then we will see if it's something that people use.
What about ... (Score:2, Funny)
Cell phones (Score:3, Funny)
Re:Cell phones (Score:2, Funny)
Talk about being married to your cellphone...
Re:Cell phones (Score:2)
Memories for life? (Score:2)
I have data that is still intact from 1980, 25 years ago, because I have taken care to keep copying it forward to current backup media (tapes to CDs to DVDs, etc.)
Point being, we can keep data for as long as we're interested in investing the time and money to do it right. Just because some fool can't learn how to back up
Re:Memories for life? (Score:2)
Why would you bend over a toilet? Ok, I don't really wanna know...
put it on the web? (Score:2)
Speaking of simulating life... (Score:5, Interesting)
- sequence the bacteria's DNA right there in the doctor's office (this part isn't really an IT challenge)
- from the bacteria's genetics, determine which antibiotics (out of all known ones) can effectively kill it
- if none can effectively kill it, ship the DNA sequence information off to the CDC's supercomputers, and have them automatically develop a new antibiotic that will kill the bug.
I figure that this is a challenge for the next forty years, not just for the next twenty.
Re:Speaking of simulating life... (Score:2)
Re:Speaking of simulating life... (Score:2)
(Poorly-written Slashdot HTML filter...)
Biggest Problem in that Scenario (Score:4, Insightful)
- Get the patient to take the antibiotic all the way through
That's the crucial missing step that's let the nasty bugs get this far
Re:Biggest Problem in that Scenario (Score:5, Informative)
The problem is that resistance isn't either/or -- that is, it's not as simple as saying a particular strain of bacteria is resistant or it's not. All strains have greater or lesser degrees of resistance; more precisely, individual bacteria within the population have greater or lesser degrees. When you're on antibiotics, the bacteria tend to die off in, pretty much, an exponential decay curve. Once the curve drops below a certain level, the remaining bacterial population is insufficient to maintain the infection; your immune system is fighting the infection too, of course, and it can take care of the remaining bacteria, which are the more resistant ones, once the less resistant ones are killed off by the antibiotics.
So what happens when you stop taking the course of antibiotics halfway through? Well, where you previously had a bacterial population consisting of some bacteria with weak resistance, some with moderate resistance, and some with strong resistance, now you only have the latter two categories. And these are going to continue breeding, and your immune system is going to spend its resources fighting them equally, without preference as to which is more or less antibiotic-resistant -- which means more of the bacteria with greater resistance will survive and grow. OTOH, if you'd finished the antibiotics, only the most resistant bacteria would be left, and your immune system could probably finish them off on its own.
To top it off, resistance requires an expenditure of energy on the part of the bacteria -- you're quite right that many such critters have non-expressed resistance genes already in their genomes; the reason these genes aren't usually expressed is because doing so takes energy the bacteria would usually prefer to devote to feeding and reproducing. So in a patient who doesn't take antibiotics at all, the percentage of resistant individual bacteria is going to be very low. This means that taking half a course of antibiotics is the worst possible course of action: if you take the whole thing, you'll probably end up killing off the entire infection; if you take no antibiotics, you'll either get better or you won't, but either way you won't encourage the formation of a resistant strain.
And the reason that shorter courses of antibiotics are being prescribed is that, quite simply, many newer antibiotics work more quickly. That's the only reason. It has nothing to do with some magical discovery that the traditional ten-day course was longer than it needed to be.
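The dynamics described above can be illustrated with a toy simulation. The daily survival rates below are invented for illustration; this is not a biological model, just the exponential-decay argument in code:

```python
# Invented daily survival rates under the antibiotic for three
# resistance classes; not biological data.
survival = {"weak": 0.05, "moderate": 0.30, "strong": 0.60}

def course(days, start=1_000_000):
    """Population of each resistance class after `days` of antibiotics."""
    pop = {k: start for k in survival}
    for _ in range(days):
        pop = {k: int(n * survival[k]) for k, n in pop.items()}
    return pop

half = course(5)    # patient stops halfway through a ten-day course
full = course(10)   # patient finishes the course

# Stopping halfway leaves a population dominated by the resistant bugs,
# and far more total survivors than finishing the course would leave.
print(half["strong"] / sum(half.values()))      # roughly 0.97
print(sum(full.values()) < sum(half.values()))  # True
```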
Re:Speaking of simulating life... (Score:2)
The bacteria take up the DNA, which then binds to the gene when it is attempting to make the mRNA to synthesize its protein, thus blocking mRNA formation and killing the bacterium (or at least slowing it down enough for the patient's immune system to kill it.)
Most important goal... (Score:3, Funny)
How about (Score:3, Insightful)
Keeping people employed for more than five weeks?
nonclassical methods (Score:3, Funny)
I know! I'll develop a new type of database that is indexed by the degree to which the primary key sounds either "woody" or "tinny" when spoken. I'll make millions!!
Memories for Life (From the research report) (Score:2)
Vision: applications
There are numerous applications of Memories for Life. In the next 5-10 years, we expect that the most progress may be made in systems that help people retrieve and organize their memories. For example, such a system might help a person find all memories, regardless of type, about his or her holiday in Germany two years ago; and also help organize these memories by time, location or topic.
Nice for someone who has Alzheimer's
How about this? (Score:3, Interesting)
Re:How about this? (Score:2)
Re:How about this? (Score:3, Interesting)
Human: Can you go get me some food? ALICE: Sorry my body isn't attached right now. I'm stuck inside this computer.
Simulated Sex should be our next challenge... (Score:5, Insightful)
Re:Simulated Sex should be our next challenge... (Score:2)
Humanoid, preferably. Female would be nice, too.
Given my chances with human females, that's probably the only hope that my genes have for the future.
Re:Simulated Sex should be our next challenge... (Score:2)
Do you mean a simulation other than your left hand?
A Slashdot Dupe Checker (Score:5, Funny)
More distributed computing (Score:2)
More programs like those at distributed.net. Also cancer research, mapping the human genome, and SETI.
I see more distributed software technologies. Microsoft itself wanted to try "download and run" schemes, where you purchase a piece of software and then download some code chunk that allows you to run the program for only a single session.
In gaming Bit Torrent is a popular medium for patching games and Steam is certainly going t
Re:More distributed computing (Score:2)
Too bad they're impossible (Score:5, Insightful)
Re:Too bad they're impossible (Score:2)
(Whether you meant 'IT' beyond the scope of the article I don't know, but suggest that you read it...)
Re:Too bad they're impossible (Score:2)
Re:Too bad they're impossible (Score:3, Insightful)
Web applications (Score:3, Insightful)
This is where technology like Macromedia Flex [macromedia.com] comes in. I've seen this stuff in action, and the process of creating complex applications is so easy it's unbelievable. A field of sortable and stretchable columns can be generated with about three lines of code, and the data that goes into it can come from any application server you like.
Sure, anything that uses the Flash player gets a hammering on Slashdot, but I sense that times are a-changing around here and more people are starting to wake up to the potential of this stuff, even if it goes a little against the open source ethos of the place.
BTW, if you're a member of the "Flash sucks and I hate it because some people used to abuse it by making annoying animations with it" brigade, see my journal where I've already refuted your half-baked criticisms.
Re:Web applications (Score:3, Interesting)
Re:Web applications (Score:3, Insightful)
In any case SVG doesn't have half the abilities of Flash and it definitely doesn't have anywhere near the same lev
Here's a challenge... (Score:5, Funny)
Making Firefox on Linux as quick as Firefox on Windows...
Idea for Linguistic Intermediate Language (Score:3, Interesting)
Let's say there's a chatroom with a guy from Poland, a girl from Japan, and a duck (this is not a serious example, obviously, and why they are in this chatroom is left to the user's imagination). The duck sends his message, and it gets scrambled into the intermediate language. This language can now be translated directly into any local dialect, without having to translate the message for each separate language being used, or without the user having to know the language. Just imagine - a user from Russia chatting with a user from Mexico, and neither knowing the other speaks anything but their native tongue. Of course it's not meant to be a cultural mask or anything - certain language / cultural barriers would of course be present, but at least this is better than having to run to Babelfish every few seconds.
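A minimal sketch of the pivot-language idea, with invented toy vocabularies (the word lists, pivot tokens, and `translate` function are all made up for illustration):

```python
# Each language needs only a mapping to and from the pivot language,
# not one translator per language pair. Vocabularies are invented.
to_pivot = {
    "pl": {"czesc": "GREETING", "kaczka": "DUCK"},
    "ja": {"konnichiwa": "GREETING", "ahiru": "DUCK"},
}
# Invert each mapping to go back out of the pivot.
from_pivot = {lang: {v: k for k, v in d.items()} for lang, d in to_pivot.items()}

def translate(text, src, dst):
    """Translate word-by-word via the pivot language."""
    return " ".join(from_pivot[dst][to_pivot[src][w]] for w in text.split())

print(translate("czesc kaczka", "pl", "ja"))  # -> konnichiwa ahiru
```

The payoff is the scaling: with N languages you maintain N mappings instead of N*(N-1) pairwise translators. (Real translation is of course far harder than word substitution.)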
Re:Idea for Linguistic Intermediate Language (Score:3, Interesting)
Verifying compiler? Correctness proving tools? (Score:4, Insightful)
Verifying compiler? Correctness proving tools? Two words - Halting Problem [wikipedia.org].
Re:Verifying compiler? Correctness proving tools? (Score:2)
Re:Verifying compiler? Correctness proving tools? (Score:3, Insightful)
Maybe you'd care to expand on those two words to explain why you don't think that there are classes of computational processes for which classes of specification can be proven as met, or why you don't think this is useful...
There most certainly are such classes and classes, but the proving cannot be automated (except for non-Turing-complete languages). A computer can verify that a "proof" is indeed a proof, but it cannot produce such a proof itself.
Perhaps if every binary came along with a proof of its
Re:Verifying compiler? Correctness proving tools? (Score:3, Insightful)
This is all well and nice, but the halting problem is just one small example of an undecidable problem. In fact, every nontrivial, semantic problem about computer programs is undecidable ("semantic" means that the answer only depends on what the program does as opposed to depending on the program itself. "nontrivial" means that the answer isn't the same for all queries).
This narrows the set of decideable problems to ones that are either:
Challenge Calling (Score:2)
They missed one of the list.. (Score:2)
Solve the spam problem (Score:5, Insightful)
"grand challenges" from the 1950s (Score:2)
- Automated language translation.
- Self-programming computers.
- Natural language understanding and interfaces.
- Image understanding.
These have migrated in and out of artificial intelligence over the decades.
but that's unpossible! (Score:2)
That sounds a lot like the problem at the heart of the Church-Turing thesis [wolfram.com], the so-called halting problem. And that one was shown to be impossible [well, "undecidable" was their exact word] because it can be mapped to Goedel's incompleteness [wolfram.com] result...What did I miss? This will be hard even for trivial programs, let alone any you would want to run.
Besides, this correctness mirage has
pass the "Seinfeld" test (Score:2)
The worthy challenge (Score:2, Insightful)
Something really really hard (Score:2)
The transistor (Score:2)
They need to learn basic compsci (Score:3, Informative)
This, admittedly, was in the summary text in the magazine, not the article by the scientists themselves, so it could be a case of "idiot summarizing it wrong", but there just is NO WAY to do what they are talking about. No how, no way.
To prove a program correct requires that you run it in a test environment. If you run it, and it is not correct, you get the same problem in your test run that occurs in the real run. Therefore you cannot test for a program's correctness automatically in a compiler. For example, any program trying to detect if a loop is infinite will itself end up looping infinitely when it encounters one and tries to check it.
Re:They need to learn basic compsci (Score:3, Informative)
Wrong. Machine-verified proof of correctness is quite feasible. We did it twenty years ago. [psu.edu] The DEC SRC people did a nice proof of correctness system for Java in the 1990s, before Carly shut down DEC research. It's hard to build such systems, but not impossible. The theory is well understood now, which wasn't true when we did it.
It's not that hard to prove loop termination. You must define some measure which, for each iteration of the loop, decreases.
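A minimal sketch of that decreasing-measure (loop "variant") idea, using Euclid's algorithm, with the measure checked at runtime via assertions (a real verifier would prove it statically instead):

```python
def gcd(a, b):
    """Euclid's algorithm, with the termination measure made explicit.

    The measure (variant) is b: a non-negative integer that strictly
    decreases on every iteration, so the loop must terminate.
    """
    assert a >= 0 and b >= 0
    while b != 0:
        variant = b                 # measure before this iteration
        a, b = b, a % b
        assert 0 <= b < variant     # the measure strictly decreased
    return a

print(gcd(1071, 462))  # -> 21
```

Because `a % b` is always in `[0, b)`, the variant assertion can never fire; that observation is exactly the termination proof.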
Re:Actually... (Score:3, Informative)
Of course, if the definitions are wrong, all bets are off, but it's still an incredibly useful thing to have.
For another example, think of software test suites. Nowadays, you have a programmer explicitly define a score of situations and check to make sure that these situations fit defined requirements. With a verif
Re:Teleportation (Score:2, Informative)
This distinction is important because we will learn to telecopy objects and telecopy live organisms before we learn to teleport them.
Re:Teleportation (Score:3, Funny)
Helloooooo, lawsuit.
Re:Teleportation (Score:2)
I can do destruction, so as soon as someone gets the telecopy process done, I'm good to go... or does this count as placing something into the public record and prevent me from patenting it? Hmm... better rethink this post....
Re:How about (Score:2)
Re:How about (Score:2)
If not, I may as well add: Time Travel!
I got it! (Score:2)
Of course, we would also need a parser to determine if we wrote the specs correctly, possibly by using a third language that defines the way that specs are transformed from human language to machine-readable language.
Brute force AI timeline (Score:2, Insightful)
The computation required to simulate a neuron sufficiently accurately is not known exactly, but we can put some reasonable estimates to it. I use 1 synapse firing = 1 bit +-
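For what it's worth, a back-of-envelope version of that kind of estimate. All figures below are rough, commonly cited orders of magnitude, assumed for illustration only:

```python
# Rough orders of magnitude, assumed for illustration; not measurements.
neurons             = 1e11  # neurons in a human brain
synapses_per_neuron = 1e4
firing_rate_hz      = 1e2   # generous average firing rate
bits_per_event      = 1     # the "1 synapse firing = 1 bit" assumption

events_per_second = neurons * synapses_per_neuron * firing_rate_hz
bits_per_second = events_per_second * bits_per_event
print(f"~{bits_per_second:.0e} synaptic bits/sec")  # ~1e+17
```

Shift any one of those assumptions by an order of magnitude and the answer moves with it, which is why such timelines are so contentious.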
Re:Brute force AI timeline (Score:5, Insightful)
As for thought itself, I seriously doubt it works in the same way that a hardware simulation that you are describing would work. Think about how much energy would be required and how much heat would be generated compared to a human brain. Biology simply doesn't work in that way. Look at protein folding. It's extremely computationally intensive to determine the way a protein will fold, but biologically the process of folding is relatively simple. It's the same with thought. If we could figure out how the brain works, then we could probably simulate it with hardware that we could make now.
Re:Brute force AI timeline (Score:3, Interesting)
(That is, a good enough atomic-level brain/body simulation would still respond "don't remind me" when asked about its last birthday, just like the human being being simulated.)
Whether anybody was home would be one for the philos
Re:Brute force AI timeline (Score:3, Interesting)
I think it *is* software based. We don't really understand all of the mechanisms that cause synapse formation or alteration, and I saw some research last year that suggested that synapses and neurons may not be the entire answer to the brain's computational power. There were some cells thought to be support cells that have shown indications of communicating with each other and with neurons.
When someone says brute force, I take it to mean a simulation at n
Re:Brute force AI timeline (Score:3, Insightful)
Re:Brute force AI timeline (Score:4, Insightful)
Re:How about an OS as good as VM/370 with a GUI? (Score:4, Insightful)