AI Has Cracked a Key Mathematical Puzzle For Understanding Our World (technologyreview.com)
An anonymous reader shares a report: Unless you're a physicist or an engineer, there really isn't much reason for you to know about partial differential equations. I know. After years of poring over them in undergrad while studying mechanical engineering, I've never used them since in the real world. But partial differential equations, or PDEs, are also kind of magical. They're a category of math equations that are really good at describing change over space and time, and thus very handy for describing the physical phenomena in our universe. They can be used to model everything from planetary orbits to plate tectonics to the air turbulence that disturbs a flight, which in turn allows us to do practical things like predict seismic activity and design safe planes. The catch is PDEs are notoriously hard to solve. And here, the meaning of "solve" is perhaps best illustrated by an example. Say you are trying to simulate air turbulence to test a new plane design. There is a known PDE called Navier-Stokes that is used to describe the motion of any fluid. "Solving" Navier-Stokes allows you to take a snapshot of the air's motion (a.k.a. wind conditions) at any point in time and model how it will continue to move, or how it was moving before.
These calculations are highly complex and computationally intensive, which is why disciplines that use a lot of PDEs often rely on supercomputers to do the math. It's also why the AI field has taken a special interest in these equations. If we could use deep learning to speed up the process of solving them, it could do a whole lot of good for scientific inquiry and engineering. Now researchers at Caltech have introduced a new deep-learning technique for solving PDEs that is dramatically more accurate than deep-learning methods developed previously. It's also much more generalizable, capable of solving entire families of PDEs -- such as the Navier-Stokes equation for any type of fluid -- without needing retraining. Finally, it is 1,000 times faster than traditional mathematical formulas, which would ease our reliance on supercomputers and increase our computational capacity to model even bigger problems. That's right. Bring it on.
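For readers who have never touched numerical PDEs, here is a minimal, illustrative Python sketch (not the Caltech method, and not Navier-Stokes) of what "solving" a PDE on a computer typically looks like: discretize space into cells and repeatedly update every cell from its neighbors. Scale this up to three dimensions, millions of cells, and the full Navier-Stokes terms, and you can see where the supercomputer time goes.

```python
# Minimal sketch: explicit finite-difference time-stepping of the 1D heat
# equation u_t = alpha * u_xx. Real CFD codes (e.g. Navier-Stokes solvers)
# do the same kind of cell-by-cell update, just with many more terms.
import numpy as np

alpha = 0.01               # diffusivity
nx, nt = 101, 500          # grid points in space, number of time steps
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / alpha   # respects the explicit stability limit dt <= dx^2 / (2*alpha)

x = np.linspace(0.0, 1.0, nx)
u = np.exp(-100.0 * (x - 0.5) ** 2)   # initial condition: a heat spike in the middle

for _ in range(nt):
    # second spatial derivative by central differences on interior points
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0                # fixed (Dirichlet) boundary conditions

print("peak temperature after %d steps: %.4f" % (nt, u.max()))
```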
The answer: (Score:4, Funny)
42.
Re:The answer: (Score:4, Funny)
42.
Uh, it's:
f(x,y)=dy/dx(F(x))+ dy/dx(F(y))=42
I think. I haven't had to use diffy-Qs in - ever. It was a weed out class.
Re: The answer: (Score:1)
Re: (Score:1)
As x-> lim(42), x ->n-> ~=42
Why so limiting? Throw in imaginary numbers too.
42i
LOL.
I know, I probably poked a bear with that one.
Re: (Score:3)
I think your notation is a bit off from what you intended (the formula for divergence): "dy/dx" should be "d/dx" and "d/dy".
i.e. The meaning of the universe is its own constant expansion.
Re: (Score:2)
Re: (Score:2)
Base 2, non-standard coding. This really is just a fancy "1" in standard coding, and the leading zero is entirely redundant.
Re: (Score:1)
2+2 = 42
for certain values of "="
(thanks to Terry Pratchett)
Re:Education Philosophy (Score:5, Informative)
Re: Education Philosophy (Score:1)
Re: (Score:3)
You have the sim and modeling group. Those guys have to eat this stuff all day long in their FEA modelers.
Literally five people out of 1000 engineers. If they get swamped, they outsource it. Often overseas.
Re: Education Philosophy (Score:5, Interesting)
Literally five people out of 1000 engineers.
Not even that. These five guys are just punching their numbers into an FEA software package. They aren't sitting down and solving PDEs.
99% of engineers will never, not once, need to solve a PDE in the real world. Using software that implements PDEs in its backend algorithms doesn't count, since that requires zero knowledge of PDEs by the user.
Of the math I learned:
Trigonometry - Very useful. I use it all the time.
Linear algebra - Very useful for graphics and physical simulation.
Statistics - Needed for simulation and AI
Calculus - I actually had to do a closed-form integration, like, twice, in my entire career.
PDEs - Never, not once, since I left school.
Re: Education Philosophy (Score:4, Insightful)
99% of engineers will never, not once, need to solve a PDE in the real world. Using software that implements PDEs in its backend algorithms doesn't count, since that requires zero knowledge of PDEs by the user.
Applying simulation software without at least some knowledge of the underlying math is a road to failure. My electronic circuit simulation software can barf on certain circuit conditions, but I only know this because I already have some idea of the right answer, usually by good old pencil and paper methods. I should say that my circuit simulations are mostly not PDEs, but non-linear time dependent systems. I have used PDEs in magnetic system analysis, and radio antenna designs.
What worries me about the idea of an AI system being applied to simulation is that the AI, or machine learning, might optimize the wrong thing, and the user would have no way of knowing this had happened. This relates to the so-called racist facial recognition algorithms, which may be picking up on the wrong features in facial images.
Re: (Score:2)
It's a staple in electrical engineering as well, e.g. device physics and electromagnetics. I doubt there is a modern gadget or structure today that wasn't designed in part using CAD software that relied on the solution of a system of PDEs.
Re: (Score:3, Insightful)
I doubt there is a modern gadget or structure today that wasn't designed in part using CAD software that relied on the solution of a system of PDEs.
It's unfortunate that today's CAE software can't do the PDEs for those engineers. They have to do all of those PDEs by hand. :P
Outside of FEA software development, and a few advanced users of those tools, there isn't much demand for those skills. Statistical skills are in higher demand. Prove me wrong.
Re: (Score:2)
It's unfortunate that today's CAE software can't do the PDEs for those engineers. They have to do all of those PDEs by hand. :P
You do realize that the article is talking about software solving PDEs. It's not saying they're notoriously hard for people to manually solve, it's saying it's highly computationally intensive.
The idea is to replace CAD tools that solve the PDEs through numerical integration with CAD tools that solve the PDEs with neural networks. So if you're using CAD software that solves PDEs, you're using PDEs in the real world.
Re:Education Philosophy (Score:5, Insightful)
My point is that most math taught in schools is heavy on calculus and light on statistics.
Of course CAD software is ubiquitous and uses PDEs. However, the only people that need to know those PDEs are ones doing groundbreaking research or developing CAD software. Most engineers will never do the math after graduation.
Engineers with strong knowledge of statistics are hard to find. Believe it or not, AI is typically classified as an area of statistics. [princeton.edu] TFA is about using statistical tools to accomplish what was traditionally done with calculus.
Having such a strong focus on calculus is detrimental to our technological development. It forces us to look at problems in one particular way, when there may be better statistical approaches to solving those problems.
Re: (Score:3)
Most people, even if they use math in their work, do not actually do statistics, except for cranking standard formulae to get the answers. I did some statistical theory in final year math at uni. It is not easy to work out where formulae such as the Gaussian or Rice distributions come from. And there are of course many statistical fallacies that trip people up. This may be beyond my usual engineering field, but I believe the 2008 financial crisis was in part due to hiding risky lending behind complex financial instruments.
Re: (Score:2)
AI would be closer to an area of probability theory rather than "statistics". Statistics is for bean counters; probability theory is for actually understanding and modeling stochastic events.
Re: (Score:2)
Re: (Score:2)
The use of statistics in engineering is mostly limited to optimizing reliability and yield versus other considerations (mostly cost), and that's only meaningful in high volume production environments.
I agree. I've spent my career in high volume production, and those topics tend to dominate though.
Low volume engineering, like building design, needs a totally different skill set than high volume. It makes me wonder if we have classified our engineering disciplines incorrectly, and we should focus more on production volume than application.
Re: (Score:2)
An anecdote won't really prove anything, but I do have experience using multiple FEA tools. I would argue that anyone using that software without a solid understanding of the math behind it is at serious risk of misinterpreting the solutions. There are some easily identifiable failures of these solvers, but there are also quite a few that can give you the wrong answer with no detectable indications unless you comprehend what's going on behind the scenes.
So I think there's definitely a need to teach PDEs and the theory behind these tools.
Re: (Score:2)
I think you've put your finger on the problem with education. It's not that it's not satisfying industry's demands. It's that the drones it turns out think that's all there is to life.
I wouldn't trust an "engineer" who knew how to punch a finite element model into a computer but had no idea about what was going on underneath any more than I'd trust a computer modeller who didn't know about the limitations inherent in floating point calculations.
Re: (Score:2)
Consider how expensive it's been to use them. That has meant that PDEs would be used only where they are really needed, and elsewhere you'd use something else. Reduce the cost of using them by a factor of 100, though, and over time you'll see them spread into many additional areas. (Just consider what CPU chips did to computer use. But also remember that it took decades for that to really spread... in fact, the spread is still happening.)
Re: (Score:2)
Likely. I am a PhD level CS type and I had almost no training in differential equations and never needed it in the first place. If everything is digital, the need rarely crops up. However, unlike the author of this story, I am very much aware that once you deal with the physical world, that changes dramatically.
Re:Education Philosophy (Score:4, Insightful)
This post is what happens when one tries to regurgitate someone else's statements without understanding them. Statistics over calculus? Depends on the industry but differential equations are not going anywhere. AI as an extension of statistics? I would argue the exact opposite. AI is fundamentally about finding order and rules, about self learning. Statistics is about analysis of noise, where we have abdicated learning the details and just look for anything we can extract from randomness or differentiate from randomness. Certainly learning and statistics are related (Shannon entropy and Fisher entropy are indeed mathematically connected) but at the deep philosophical level neither is the extension of the other.
Lastly, if you think you can understand statistics without calculus then you need to study this a wee bit more.
Re:Education Philosophy (Score:4, Insightful)
AI as an extension of statistics? I would argue the exact opposite. AI is fundamentally about finding order and rules, about self learning. Statistics is about analysis of noise, where we have abdicated learning the details and just look for anything we can extract from randomness or differentiate from randomness.
I can't tell if you don't understand statistics, or if you don't understand AI. They are both looking for patterns (order and rules) in data that has noise. AI just looks for much more complex patterns in data that has much more noise. It's an advanced area of statistics.
Re: (Score:2)
Let's try this again, now extra slow. AI research is a study of epistemology. Statistics is a study of randomness and random processes. They are not immediately related and one is not an extension of another.
Much of physics (thermal processes, quantum mechanics etc) has an essential random component, and much of what we call statistics today came from physics research. This does not make physics an extension of statistics or vice versa.
And again, statistics without differential equations and without calculus is not even conceivable. All likelihood methods (MLE for instance) are inherently calculus based.
Re: (Score:2)
Let's try this again, now extra slow. AI research is a study of epistemology. Statistics is a study of randomness and random processes. They are not immediately related and one is not an extension of another.
Both are studies of epistemology. That's why hypothesis testing is an area within statistics.
Much of physics (thermal processes, quantum mechanics etc) has an essential random component, and much of what we call statistics today came from physics research. This does not make physics an extension of statistics or vice versa.
Much of calculus came from physics research as well. What's your point?
And again, statistics without differential equations and without calculus is not even conceivable. All likelihood methods (MLE for instance) are inherently calculus based.
Most university statistics courses don't have calculus as a prerequisite. You don't need calculus to understand much of statistics. However, as you point out, there are a few exceptions.
Even the AI methods have to use calculus based statistics and exact solutions to differential equations to check that they work properly.
That was to obtain ground truth. [wikipedia.org] You can theoretically substitute measurements of a physical system to achieve the same goal. (this would be a good topic for a research paper BTW)
Re: (Score:1)
Re: (Score:2)
The only similar requirements I had were to take a "humanity". It didn't matter which one it was. I took economics and psychology, and found both to be rewarding.
Re: (Score:2)
Um, you mean the modern culture that gave us global warming, environmental disasters, the ability to think science is some sort of dodge. Gee, why would engineers want to screw with how the causes of those problems affect their systems?
Re: (Score:2)
Greetings, fellow CSU or UC graduate.
Re:Education Philosophy (Score:4, Insightful)
>Both are studies of epistemology. That's why hypothesis testing is an area within statistics.
>Much of calculus came from physics research as well. What's your point?
My point is that areas of research with overlapping competencies are not necessarily identical.
>Most university statistics courses don't have calculus as a prerequisite. You don't need calculus to understand much of statistics. However, as you point out, there are a few exceptions.
Aside from some very introductory classes, that is flatly impossible. You cannot even properly define the area under the curve (and thus discuss the confidence intervals in terms of sigmas for Gaussian curves/pdfs) without calculus. Courses that do not have calc as a prerequisite tend to do a curt intro to calculus themselves.
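To make the parent's point concrete, the familiar "k sigma" confidence statement is literally an integral of the Gaussian density; this is a standard identity, written out in LaTeX only for illustration:

```latex
P(\mu - k\sigma \le X \le \mu + k\sigma)
  = \int_{\mu - k\sigma}^{\mu + k\sigma}
      \frac{1}{\sigma\sqrt{2\pi}}
      \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx
  \approx 0.68,\ 0.95,\ 0.997
  \quad \text{for } k = 1, 2, 3.
```

There is no elementary closed-form antiderivative here, which is exactly why the values end up computed numerically or read from tables.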
>That was to obtain ground truth. You can theoretically substitute measurements of a physical system to achieve the same goal. (this would be a good topic for a research paper BTW)
No you cannot, unless you have some fundamental reason to be able to place bounds on your experimental errors. So this tends to work for some cases but then fails whenever errors are hard to control (like near phase transition boundaries). There is no shortage of research on fluctuations in physics but yes there is always room for one more paper if you have an original angle.
>However, there isn't enough room in the curriculum to add more classes. Something has to be removed. I think the calculus curriculum needs to be pared down.
No, we need to separate STEM and humanities paths earlier and prepare people for these different paths in a more focused way. Calculus should be moved into the tech high school curriculum. In the USA, I would make knowledge of English, fluency in vector calculus, and the ability to grok some Turing-complete instruction set prerequisites for getting a tech high school diploma, much like exposure to history and Shakespeare should be required for a humanities diploma.
Re: (Score:2)
No, we need to separate STEM and humanities paths earlier and prepare people for these different paths in a more focused way.
I do not agree about this separation.
I came to realize when I studied engineering at uni (many years ago), that my written English was inadequate for even the simple job of writing a few pages for a lab report. There wasn't any "English for Engineers" course, so I just did the best I could by teaching myself. Recently, I have got more involved in philosophy and politics. I feel the lack of a sound humanities background.
I do not know why this is, but there is a divide in what people want to study, from an early age.
Re: (Score:2)
I mostly agree. I never said humanities were unimportant. Techies can employ humanities majors to communicate and do politics, and humanities majors can employ techies to do techie stuff and not meddle in communication and politics.
You also have people capable of bridging these worlds. Much like you can graduate from college with multiple majors, you should be able to graduate high school with multiple diplomas if you are able and willing to work a bit harder than most.
Re:Education Philosophy (Score:5, Informative)
PDEs are a major backbone in industry. Fluid dynamics is used everywhere from bridges to cars to medical injection systems to airplanes. I work on a liquid phase chromatography simulator that is used in industry, and that too is a bunch of coupled PDEs, ODEs, and algebraic equations.
A lot of the work on finite elements, with things like discontinuous Galerkin, stabilized space-time, and advanced finite volume techniques, is developed to help with industry applications. Industry does a lot of the funding for this work.
Re: (Score:2)
By contrast, we have 5 people in the analysis group doing PDEs. If they have too much work to do, it's cheap to outsource that overseas.
Re: (Score:2)
For what I do we normally break things down into frequentist statistics and Bayesian statistics. Overall, I prefer the Bayesian approach and we use it for all the error modeling with tools like MCMC. I find that Bayesian methods do a much better job of handling uncertainty with highly non-linear and coupled models.
Frequentist statistics tends to be easier and much less computationally intensive, but it gives results which don't end up matching experiments very well.
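For anyone curious what the Bayesian/MCMC workflow the parent describes looks like in practice, here is a minimal, hypothetical Python sketch: a hand-rolled Metropolis sampler estimating one parameter of a nonlinear model from noisy synthetic data. Production work would use an established MCMC package; the model, prior, and numbers here are made up purely for illustration.

```python
# Minimal sketch of Bayesian error modeling with MCMC: a hand-rolled
# Metropolis sampler estimating the rate k of a nonlinear model
# y = exp(-k * t) from noisy synthetic data. (Illustrative only.)
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 30)
k_true, sigma = 0.7, 0.05
y_obs = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)   # synthetic data

def log_posterior(k):
    if k <= 0.0:                      # flat prior on k > 0
        return -np.inf
    resid = y_obs - np.exp(-k * t)
    return -0.5 * np.sum(resid**2) / sigma**2   # Gaussian likelihood

samples, k = [], 1.0
for _ in range(20000):
    k_prop = k + rng.normal(0.0, 0.05)          # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(k_prop) - log_posterior(k):
        k = k_prop                              # accept the move
    samples.append(k)

post = np.array(samples[5000:])                 # drop burn-in
print("k = %.3f +/- %.3f" % (post.mean(), post.std()))
```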
Re: (Score:2)
I am going to have to disagree with this. Modeling of any system, whether it is kinematic, thermal, electrical, fluid, etc., requires the use and ground-level understanding of differential equations; no statistics in sight. I guess everyone's experience is different.
Re: (Score:2)
I am going to have to disagree with this. Modeling of any system, whether it is kinematic, thermal, electrical, fluid, etc., requires the use and ground-level understanding of differential equations; no statistics in sight. I guess everyone's experience is different.
TFA just said the opposite. They did it better with AI, and AI is just advanced statistics.
Re: (Score:2)
A.I. looks like an extension of that 1990s fraud known as fuzzy logic. To their credit, A.I. developers seem to have withdrawn from that silliness.
A.I. appears to work to give the results that knowing underlying mechanisms would give, without providing knowledge of those mechanisms. If A.I. is just statistics, does statistics generally work better than model fitting?
Computational not mathematical (Score:5, Informative)
Personally, I consider this a computational innovation, not a mathematical one. The authors' main innovation is training and running their neural network in the Fourier-transformed (frequency) domain, giving them much better results than previous attempts at using neural networks for fluid dynamics PDEs.
Overall, if I understand correctly, the way this category of schemes tends to work is that the neural network learns more or less "what fluids behave like". It is then given some set of initial conditions and then generates what it "thinks" happens next.
Traditional PDE solvers (which I worked with professionally for decades) can do the same, basically by doing repeated arithmetic computations simulating the physics within lots of little cells (or finite elements). This generates a solution that is "exact" and repeatable within the physical assumptions and discretization techniques involved. But...that's usually way more information than a researcher really needs.
For example, someone looking at turbulence around an airframe doesn't really care about the exact shape of any given simulated vortex, but rather about where vortices tend to form. So much of that detail from the PDE solver is ignored and wasteful.
These neural networks are asked to get the gestalt right, much more quickly. They are therefore useful even though no particular vortex position they happen to have "guessed" will correspond precisely to the exact simulated physics.
Here is the paper: https://arxiv.org/pdf/2010.088... [arxiv.org]
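For the curious, here is a toy Python sketch (not the authors' code; details invented for illustration) of the core trick described above: push the field into Fourier space, apply a learned linear mixing to only the lowest-frequency modes, transform back, and add a pointwise nonlinearity. In the real model the complex weights are trained end to end and the layers are stacked; here they are just random numbers.

```python
# Toy sketch of a Fourier-layer forward pass, the core idea behind the
# Caltech "Fourier neural operator": mix only the low-frequency modes.
import numpy as np

rng = np.random.default_rng(0)
n_grid, n_modes = 256, 16                 # grid resolution, retained Fourier modes

# In a real model these complex weights are trained; here they are random.
W = rng.normal(size=(n_modes,)) + 1j * rng.normal(size=(n_modes,))

def fourier_layer(u):
    u_hat = np.fft.rfft(u)                    # to frequency domain
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = W * u_hat[:n_modes]   # mix only the low frequencies
    v = np.fft.irfft(out_hat, n=len(u))       # back to physical space
    return np.maximum(v + u, 0.0)             # skip connection plus nonlinearity,
                                              # loosely following the paper's layer structure

u0 = np.sin(2 * np.pi * np.arange(n_grid) / n_grid)   # stand-in for an input field
print(fourier_layer(u0)[:5])
```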
Re: (Score:2)
Re: (Score:2)
It seems unlikely, but if they can weed out the runs that will later be discarded, they can still save a great deal of computation.
Re: (Score:2)
The authors' main innovation is training and running their neural network in the Fourier-transformed (frequency) domain, giving them much better results than previous attempts at using neural networks for fluid dynamics PDEs.
It reminds me of how our retinas translate vision into the log-polar domain prior to feeding it to our brain's neural networks. The log-polar domain (1) gives better data compression and (2) increases the size range of objects that can be tracked using a simple translational model.
http://users.isr.ist.utl.pt/~a... [ist.utl.pt]
Re: (Score:2)
Personally, I consider this a computational innovation, not a mathematical one.
Not only you. This clearly is numerical approximation, done in a somewhat novel way, nothing else.
Re:Computational not mathematical (Score:5, Informative)
This is something else, more comparable to LES (Large Eddy Simulation) models, which try to capture the essential statistics of turbulence so that a coarser-resolution (less computationally expensive) simulation looks similar to the real deal. We have known since the work of Edward Lorenz that the pointwise solution will become exponentially wrong the longer we integrate, and the general statistics of the solution might be OK or might be complete bullshit.
LES models are situation dependent. Models that work well in a certain setup (for example, isotropic turbulence) can lead to nonsense in other contexts. I don't think it is very clearly stated in the article, but I think this is the same thing. You train an NN on certain examples for which you know the "exact" solution and use it to kind of interpolate in that region of parameter space, developing a turbulence model where you don't have the slightest physical idea of what it is assuming, but which works OK.
What's more, I don't really care for the example they use to illustrate the paper. They demonstrate it on the 2D equations with an initial condition that resembles a vorticity dipole. Two things:
a) The 2D equations are MUCH easier to deal with than the 3D equations (compare the vorticity equations sketched after this list). 2D flow is much easier to organize and displays considerably less chaotic behavior. A direct consequence of this is that "we" (as if I ever could) have proven the equivalent of the Millennium Navier-Stokes problem for the 2D case, whereas for the 3D case there hasn't been much progress in the last couple of years.
b) Somewhat associated with a), freely evolving isotropic 2D flows TEND to form a vorticity dipole, due to a process called the inverse energy cascade. When you are at that stage it is much easier to "guess" the dynamics, as you have most of your energy contained in the largest scales (which require far fewer degrees of freedom to represent).
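For readers outside the field, the difference pointed to in a) is easy to see in the vorticity form of the equations (standard textbook material, written out here only for illustration): the 3D vortex-stretching term has no counterpart in 2D.

```latex
% 3D: vorticity is stretched and tilted by the velocity gradient
\frac{\partial \boldsymbol{\omega}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\,\boldsymbol{\omega}
  = (\boldsymbol{\omega}\cdot\nabla)\,\mathbf{u} + \nu\,\nabla^{2}\boldsymbol{\omega}

% 2D: the stretching term vanishes, and scalar vorticity is merely advected and diffused
\frac{\partial \omega}{\partial t} + (\mathbf{u}\cdot\nabla)\,\omega = \nu\,\nabla^{2}\omega
```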
I like the idea in the paper. My PhD is basically about using Fourier methods for turbulence simulations in situations they aren't especially well suited for, so anywhere you add Fourier I'll be happy. But the report reads to me as incredibly overhyped (this being the AI field, I'm not surprised). The paper is OK, although they could be a little more honest about the shortcomings and the fact that they are considering the easiest examples of them all.
Re: (Score:2)
Interesting. So this is very much another case where an ANN can be completely off. That does not surprise me: interpolation of statistics (the training data is always a statistical sample of something) can be exceptionally far off, because these models have no concept of hard state changes at particular places.
As a CS PhD, the overhyping the AI field does deeply offends me. They are constantly trying to sell something they cannot deliver. As such, they are competing with other CS research in a completely dishonest and dishonorable way.
Re: Computational not mathematical (Score:1)
Re: (Score:2)
These neural networks are asked to get the gestalt right, much more quickly. They are therefore useful even though no particular vortex position they happen to have "guessed" will correspond precisely to the exact simulated physics.
That is all very well, but how do you know the AI got the right answer? I guess you would have to run a slow exact physics simulation as a check. There does seem to be a risk with pattern recognition algorithms that they do silly things, like mistaking a cat for an elephant.
I am a bit out of my area of expertise again, but I gather that one of the basic problems with the study of turbulent flow is that you cannot make the grid size small enough to be confident of an accurate simulation. The smaller the grid, the more the computation costs.
Re: (Score:1)
Re: (Score:2)
Accurately Named. (Score:1)
"Artificial" means "not real." So, "Artificial Intelligence" means "not real intelligence." And that is exactly what these deep learning techniques produce: algorithms that behave almost as if intelligent, even though they aren't. So, since they aren't actually intelligent, using a word like "artificial" which means "not real" is a great way to describe them.
But you have been told this before, many times. You have a semantic hang-up that is not shared by the rest of the world, and certainly not by the computer science community.
Re: (Score:1)
You are wrong on this. The Oxford dictionary says for "artificial": "made or produced by human beings rather than occurring naturally, especially as a copy of something natural."
Hence "artificial" refers to the way something was made, not to what it is or can do. For example, an artificial hip is still very much a hip, or artificial sweetener is still very much sweet. Artificial intelligence, however, is not intelligence by any sane definition of that term. It is just dumb, insight-less automation and hence
No, you are wrong on this. (Score:2)
I can quote the dictionary [merriam-webster.com] too:
3b: Imitation, sham
So the word has more than one overlapping meaning, this one in particular being precisely consistent with my point. And as for the definition of artificial intelligence [merriam-webster.com]:
1 : a branch of computer science dealing with the simulation of intelligent behavior in computers
2 : the capability of a machine to imitate intelligent human behavior
Something simulates a thing when it is not that thing. Something imitates a thing when it is not that thing. I could provide more examples, but the point stands.
Re: (Score:2)
That is not your original chain of reasoning.
Re: (Score:2)
Re: (Score:2)
It is. The deception relies both on the wish many people have for slaves that serve them and on their fear that those slaves will rise up. Computers will do neither. They are tools and have no agency.
Re: (Score:2)
Re: (Score:2)
That is a keen observation. I completely agree.
Re: (Score:2)
Re: (Score:2)
Pretty much so. The term "intelligence" gets so badly abused these days that actual intelligence is now often called "general intelligence". Of course, that is just the same thing as the original definition. There is no "special intelligence", that term makes no sense. "Special intelligence" is called automation and it means doing tasks without a clue how or why anything works. And that is all "AI" can do.
I'm skeptical (Score:1)
All the same, it may be possible for AI methods to discover useful low-order approximations to fluid mechanics solutions. But whether you can then use them for harder problems is kind of a squishy question that remains open.
Not really AI (Score:3, Interesting)
It's a bit misleading to call this category of approximation "AI", or even learning. A neural net, particularly a deep net, can be viewed as a universal approximator, in this case mapping from a high-dimensional space of the local state of the Navier-Stokes equations to the next predicted state. The mapping is an even higher-dimensional, continuous surface, which is approximated by iteratively fitting neural net weights to it, using standard Navier-Stokes numerical methods as a guide. In other words, exactly the same mathematical tools as for AI, but it's not AI. It is function approximation, there is nothing intelligent about it. Well, apart from the intelligence of the researchers; they are bloody intelligent. Who else even has a clue what's going on here?
Too bad everybody quits math just before they get to partial differentials; that's where it gets fun.
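To make the "function approximation, not intelligence" point concrete, here is a tiny, hypothetical Python sketch: a conventional integrator generates (state, next state) pairs for a pendulum, and a random-feature model (a fixed random hidden layer with a least-squares readout, the cheapest stand-in for a trained network) is fitted to reproduce that one-step map. It is curve fitting through and through.

```python
# Toy sketch: fit a map from "state now" to "state one step later",
# with a conventional integrator supplying the training pairs.
import numpy as np

rng = np.random.default_rng(1)
dt = 0.05

def pendulum_step(s):
    """One explicit-Euler step of a frictionless pendulum, s = (theta, omega)."""
    theta, omega = s
    return np.array([theta + dt * omega, omega - dt * np.sin(theta)])

# Training pairs (state, next state) from the conventional numerical method.
states = rng.uniform(-2.0, 2.0, size=(2000, 2))
targets = np.array([pendulum_step(s) for s in states])

# Random-feature "network": fixed random hidden layer + trained linear readout.
W_hidden = rng.normal(size=(2, 200))
b_hidden = rng.normal(size=200)
H = np.tanh(states @ W_hidden + b_hidden)
W_out, *_ = np.linalg.lstsq(H, targets, rcond=None)   # fit readout weights

def surrogate_step(s):
    return np.tanh(s @ W_hidden + b_hidden) @ W_out

s = np.array([1.0, 0.0])
print("integrator:", pendulum_step(s), " surrogate:", surrogate_step(s))
```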
Re: (Score:2)
A neural net, particularly a deep net, can be viewed as a universal approximator, in this case mapping from a high dimensional space of the local state of Navier Stokes equations to the next predicted state.
Also known as a PDE integrator, then?
Re: (Score:2)
Sounds good to me.
Re: (Score:2)
If I was to take a run at this I would first look to finite element and/or fourier transforms as relatively efficient error functions for the fit.
Re: (Score:3)
It is function approximation, there is nothing intelligent about it.
That's really what AI is. Deep learning is just more complex functions.
Re: (Score:2)
Well, if you mean to say that AI is actually a bunch of cheap tricks, then agreed.
Deep learning is nothing more than the process of training a deep network, and a deep network is nothing more than a neural net with internal layers. But it sounds so much more _intelligent_ when you say deep.
That is what "artificial" means. (Score:2, Troll)
A lot of people seem to have trouble with this simple semantic issue. "Artificial" means "fake." So, if something is "Artificially Intelligent" that means that it is NOT intelligent. It is a non-intelligent algorithm doing something that seems intelligent, even though it's not.
In computer science, the phrase "Artificial Intelligence" refers to a broad set of algorithms that are used to do this kind of imitation of intelligent behavior (without actually being intelligent behavior). Something like "a machine imitating intelligent human behavior."
Re: (Score:3)
'"Artificial" means "fake."'
It can, but it doesn't here. Here it is used in its original meaning, made intentionally, by artifice, as opposed to something that just naturally grew. Words do indeed have meanings. You should learn what they are before basing arguments around them.
Re: (Score:2)
Re: (Score:3)
At some point in the last decade or so "machine learning" (which really should be called "machine training") got renamed "AI".
Really, no. It's been called AI since at least the 60's, arguably much earlier (can you spell Turing?)
Re: (Score:2)
Sounds to me like you have not heard of the Turing test.
Re: (Score:2)
Re: (Score:1)
That there is nothing intelligent about it is exactly right. And I agree, this is not even the misnamed "AI" everybody without a clue is so hyped about.
I found the one course I had to take that had this subject in it very boring. I did quite a bit more math though, but discrete stuff, up to and including some abstract algebra and quite a bit of formal logic, mostly non-classical. Now that stuff is interesting!
most of the real work was in Fourier space (Score:2)
Glad my high-school had calculus (Score:2)
My first year courses for engineering had a special math class. Most classes were 3 hours a week, either 1 hr MWF or 1.5 hrs TTh; this one was somehow four hours a week, plus two tutorials. It was basically to cram a full course of differential and integral calculus into the first semester, so that you could comprehend the second. We got it by the end of January: we couldn't have followed the chemistry course, the physics course, or the optics course without differentials and integrals. They would have made no sense otherwise.
Re: (Score:2)
Re: (Score:2)
So did mine. And my doing rather well in H.S calculus really helped give me a head start on my college engineering curriculum. But that might be due to the fact that my H.S. calculus teacher was also the basketball coach.
AI is just a computer (Score:1)
I feel like crediting Texas Instruments for manufacturing the calculator used to solve quite some world problems and earn Nobel prizes.
So... basically a FEA using a NN? (Score:2)
I actually read the article, and it appears they used NN training to transform a network of interconnected nodes into a network of finite elements, and ran the simulation. I would be surprised, though, if the authors realized they were transforming a NN into a FEA machine.
Approximation not solution. (Score:1)
Re: (Score:2)
To be fair, most actual "solving" does involve approximations in the classical approach as well. As far as I remember, a closed-form solution does not actually exist in most cases. I might be wrong; it was 30 years ago and I never needed them.
That said, the utterly stupid claim that AI has "cracked" anything is completely wrong, as usual. It has no insight, method or understanding here. It can just be trained very well on data and then can interpolate. Incidentally, that is all it can do. Because the prob
Really? (Score:2)
Unless you're a physicist or an engineer, there really isn't much reason for you to know about partial differential equations
Tell it to mathematicians, finance folks, biologists, ...
Re: (Score:1)
No, it has not (Score:2)
It is just able to be trained on data, as usual. No insights, understanding, or methods involved.
Also, note that these problems usually get solved using approximations, and Artificial Stupidity is good at that. In fact, it is the only thing it can really do.
SIGGraph paper presentations (Score:2)
I always groaned while attending a SIGGraph paper presentation on fluid simulations when the author said "We implemented the Navier-Stokes equations..." Yeah, sure you did. Show me the source code.
The Language God Talks (Score:2)
The language begins with the Calculus, and quickly moves to Differential Equations and from there into Quantum Mechanics.