Potential Effects of Recent AI Developments on Physics?

  • #1
Ege Artan
Hi, I am a physics-mathematics double major and will most likely be going into theoretical high energy physics or mathematical physics for my PhD.

Considering the recent developments in AI, and how they are predicted to cause drastic changes in many fields and job markets, do you think they will have any significant effect on physics, especially on theoretical high energy physics or mathematical physics?

I want to know whether the field I plan to pursue will be affected by these developments. Since I will most likely be going into it, are there any skills I could develop, or should consider developing, to stay competitive and pursue my dream job with as little disruption as possible?

Also, how do you think AI might affect this field specifically? Will it produce anything of use to theoretical high energy physicists or mathematical physicists?

I would appreciate it if claims were supported by hard evidence, such as articles or World Economic Forum discussions. I am specifically asking for well-backed claims because, since we are talking about people's jobs, wishful thinking becomes very prevalent and some truths can be obscured. We are already seeing this effect of wishful thinking in fields like psychology and game development.

Thank you.
 
  • #2
I have no future predictions for you. All I can say is that current LLMs cannot do physics.
 
  • #4
gleem said:
Take a look at this paper

AI and Theoretical Particle Physics https://arxiv.org/abs/2205.05803

or this presentation

https://indico.nikhef.nl/event/4875...chments/8268/11791/EuCAIFcon2024-Schwartz.pdf
I had a look at the paper but couldn't read all of it; since I am in my sophomore year, I did not know about 50-60% of the terms they were mentioning. I have fully read the presentation, though.

From what I've learned from these sources, I think AI is going to play a giant part in analyzing existing data to make something useful out of it. I don't know if AI is as creative as humans, since the arguments they gave for that sound to me like AI would be seen as more creative by definition, but I completely agree with all the considerations about data analysis and drawing conclusions from data sets.

To be honest, the things the presentation discussed piqued my interest, and I don't have the same perspective on AI in physics anymore; I might pursue it in my future career instead of treating it as a mandatory, boring box to tick. Really, the potential seems incredible, and it looks like it could substantially reduce the grunt work of developing a theory and filling in the necessary details.

I will be pursuing this during my undergrad. I have looked into IAIFI a little bit and will get involved in their events to see what more I can learn. Do you have any recommendations for improving myself at this intersection of fields? I am thinking of getting back to learning Julia and R, since I presume they'll be crucial in the actual process of applying these ML algorithms (see the sketch below). Are there any institutions other than IAIFI that you know of with internship opportunities, seminars, and workshops?
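To make sure I have the right mental picture: from the tutorials I've seen, "applying an ML algorithm" usually boils down to a fit/predict pattern like the sketch below. This is only an illustrative toy in Python with synthetic data (scikit-learn is just the example library I've come across; I assume Julia and R have equivalents):

import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic "events": 200 samples with 2 features each, purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy label: which side of a line

# The standard supervised-learning pattern: fit on labeled data, then predict.
model = LogisticRegression()
model.fit(X, y)
print(model.predict(X[:5]))  # classify the first few events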
 
  • #5
Since you are still very early in your education, it may take some time to appreciate how to use AI. Before you jump into it with both feet, perhaps it would be best to get a feel for its use by perusing the latest articles in relevant journals that are using AI. If you are lucky, maybe a group at your university is using AI, or will be.
 
  • #6
AI can write plausible-sounding theory paper titles and abstracts already: see the snarXiv.
 
  • #7
Dale said:
I have no future predictions for you. All I can say is that current LLMs cannot do physics.
Vanadium 50 said:
AI can write plausible-sounding theory paper titles and abstracts already: see the snarXiv.
I think it's worth expanding on this and reminding others that LLMs are basically autofill on steroids, and they do not fit most people's vision of what AI is or does, regardless of what the companies (marketers?) call them. They are designed specifically for the purpose of writing plausible-sounding communications, with a pleasant and accommodating tone.

We've had a number of people here claim to "develop a theory with the aid of AI", and what they produce is basically technobabble: plausible-sounding gibberish. The LLM is not "thinking". It is not doing physics.

What's scary to me is that so many people who use it seem to misunderstand it and use it for advice and decision-making.
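To make the "autofill on steroids" point concrete, here is a deliberately crude sketch: a bigram model that strings words together based only on which word followed which in a tiny, made-up training text. Real LLMs are vastly more sophisticated, but the core operation, predicting a plausible next token from statistics rather than understanding, is the same in spirit:

import random
from collections import defaultdict

# Made-up toy corpus; a real model trains on vast amounts of text.
corpus = ("the field equation implies the metric is flat "
          "the metric is curved near the mass "
          "the equation implies the mass is flat").split()

# Record which words followed which (bigram statistics).
following = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    following[a].append(b)

# Generate "plausible" text by repeatedly sampling a statistically likely
# next word. No meaning is involved at any point.
random.seed(1)
word, output = "the", ["the"]
for _ in range(10):
    if word not in following:
        break
    word = random.choice(following[word])
    output.append(word)
print(" ".join(output))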
 
  • #8
I agree, but most of "AI" is not what people think of as AI. You say LLMs are like autofill on steroids. I agree. But a lot of AI inference is statistics on steroids. Some of the most interesting work is on things we don't even consider "intelligence", such as motion and balance.
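For instance, stripped of the branding, a lot of AI inference is curve fitting at scale. Here is the one-variable version of that underlying operation, a least-squares fit, with made-up numbers purely for illustration:

import numpy as np

# Fake measurements drawn from y = 2x + 1 plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.5, size=x.size)

# "Training" is just least squares: solve for the slope and intercept
# that best explain the data. A neural network does the same kind of
# optimization with millions of parameters instead of two.
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
print(slope, intercept)  # close to 2 and 1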
 
  • #9
I don't want this to become a debate on what qualifies as "AI" (separate thread, maybe), as I don't actually care much about labels - I think in this case it's a pointless word game that can be avoided by discussing features instead of attaching labels. What matters, though, is how other people [mis]interpret it. I will say this:

I agree that there are very interesting things that can be/are being done with large sets of data/statistics, and that motion/balance control is interesting and likely useful, regardless of how it is done.

However it is done and whatever it gets called, getting robots to play soccer successfully would be a great demonstration of a useful technology. Where it becomes scary is if the robots are smart enough to recognize that scoring goals is easier if they first rip the legs off their opponents, and that it isn't forbidden by the rules. An LLM can already be pushed to recommend this, without the intelligence to figure it out on its own. And people do, and will, act on "advice" from LLMs.

I suppose that won't necessarily prevent an LLM from taking someone's job if the boss misinterprets its capabilities, though.
 
  • #10
I think the takeaway message I am hearing more and more is that AI will not (or may not) take your job, but a person using AI can and will. So find out how to use AI to your advantage.
 
  • #11
gleem said:
Since you are still very early in your education, it may take some time to appreciate how to use AI. Before you jump into it with both feet, perhaps it would be best to get a feel for its use by perusing the latest articles in relevant journals that are using AI. If you are lucky, maybe a group at your university is using AI, or will be.
I see; what you are saying makes sense. I will start reading articles from such journals.
 
  • #12
russ_watters said:
I think it's worth expanding on this and reminding others that LLMs are basically autofill on steroids, and they do not fit most people's vision of what AI is or does, regardless of what the companies (marketers?) call them. They are designed specifically for the purpose of writing plausible-sounding communications, with a pleasant and accommodating tone.

We've had a number of people here claim to "develop a theory with the aid of AI", and what they produce is basically technobabble: plausible-sounding gibberish. The LLM is not "thinking". It is not doing physics.

What's scary to me is that so many people who use it seem to misunderstand it and use it for advice and decision-making.
That is essentially what I tried to say. It seems like AI can be very helpful in noticing very subtle patterns that are not visible to humans. From what I've learned so far, it feels like it will be an incredible tool for filling in certain routine calculations while developing a theory. That is why I presume this ability will come in very handy in my future work. Am I right about this, or is there something I've missed?
 
  • #13
Vanadium 50 said:
I agree, but most of "AI" is not what people think of as AI. You say LLMs are like autofill on steroids. I agree. But a lot of AI inference is statistics on steroids. Some of the most interesting work is on things we don't even consider "intelligence", such as motion and balance.
So, could we say that AI will become a powerful statistical analysis tool for theorists?
 
  • #14
gleem said:
I think the takeaway message I am hearing more and more is that AI will not (or may not) take your job, but a person using AI can and will. So find out how to use AI to your advantage.
How do you think I should go about finding that "advantageous usage" of AI? I mean, should I get involved with some institutions' events? Should I try to take some machine learning courses at my university? At the very least, I am set on reading papers from journals, but what else should I be doing? In general, how does one get better at a field of physics, and by doing what?
 
  • #15
Ege Artan said:
In general, how does one get better at a field of physics, and by doing what?
At this stage, focus on the foundational physics and math courses and don't become overly concerned with AI; it could be a distraction at this point. Find your specific interest in an area of physics and then monitor developments in that area. If AI is having an impact there, find out which AI models are being used and how. As I noted above, it would be great if a group at your university were using AI, so that you could talk to them.
 
  • #16
Ege Artan said:
So, could we say that AI will become a powerful statistical analysis tool for theorists?
Probably not. How many theorists are doing statistical analysis now?

You have decided AI is the solution without specifying the problem.
 
