Here is a really interesting seminar series, "Physics Meets ML."
We are pleased to announce Physics ∩ ML (read "Physics Meets ML"), an online seminar series at the interface of theoretical physics and machine learning that builds on a 2019 meeting of the same name at Microsoft Research.
We've been interested in going virtual for some time as interest in the interface between theoretical physics and ML continues to grow, and it seems particularly timely now given the coronavirus pandemic. Though the content may branch out over time, we will begin with four seminars at the interface of the two subjects. Details for the first four seminars:
- May 6: Taco Cohen, Qualcomm AI Research, Neural Graph Networks.
- May 20: Phiala Shanahan, MIT, Building Symmetries into Generative Flow Models.
- June 3: Ard Louis, Oxford, Why do neural networks generalise in the overparameterised regime?
- June 17: Koji Hashimoto, Osaka University, Deep Learning and Quantum Gravity.
All seminars will take place at 12:00 pm Eastern Time, with viewing information sent to the Physics ∩ ML mailing list a few days before each seminar. You can sign up for the mailing list here or by visiting the website.
We wish you good health during these difficult times, and hope to see you virtually.
Jim Halverson, Northeastern University
Sven Krippendorf, LMU Munich
Fabian Ruehle, CERN
Gary Shiu, University of Wisconsin
Greg Yang, Microsoft Research
Please join us this Wed 3/4/20 at 1 pm in DeMeritt Hall Rm. 251 for the latest seminar on machine learning in science.
Dr. Matthew Argall will be talking about Neural Networks:
Machine learning often gets a bad rap for being a “black box”: data goes in, an answer comes out, and what happens in between is magic. Perhaps no other machine learning algorithm suffers from the black-box reputation more than the neural network. In this seminar, we will demystify the neural network by building one from scratch. I introduce the concepts of “activation function”, “perceptron”, “feed forward”, and “backward propagation”, then present an overview of many of the different types of neural networks that exist. Finally, I present two applications: a Convolutional Neural Network (CNN) to identify exoplanets and a Long Short-Term Memory (LSTM) Recurrent Neural Network (RNN) to classify plasma boundaries observed by NASA’s Magnetospheric Multiscale mission.
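To give a flavor of the "from scratch" approach, here is a minimal sketch (an illustration only, not the seminar's notebook) of a one-hidden-layer network trained on XOR. It touches all four concepts from the abstract: activation function, perceptron (a layer of weighted sums), feed forward, and backward propagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic problem a single perceptron cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):                   # activation function
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 4))      # input -> hidden weights (4 perceptrons)
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))      # hidden -> output weights
b2 = np.zeros(1)

lr = 1.0
losses = []
for step in range(5000):
    # feed forward: propagate inputs through the layers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # backward propagation: chain rule applied layer by layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)
```

Everything here is a handful of matrix multiplications and derivatives, which is exactly the point: the "magic" in between is ordinary calculus.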
The Jupyter notebook can be downloaded in advance here: https://chapmanlab.github.io/ML/
See you there!
Join us for the third installment of the Machine Learning for Physical Science and Engineering Seminar Series. This is a great introduction for the true beginner looking for a place to start applying machine learning to their own data.
"Machine Learning: A Beginner's Guide" presented by Dr. Matthew Argall, UNH
This seminar will present a high-level overview of machine learning (ML). It attempts to answer three big questions for beginners: What is it? What models are out there? And how can I get started? To answer the first question, I start with a definition of ML and cover some early milestones that helped lead to the state of ML today. For the second, I discuss the different branches of ML, including supervised and unsupervised learning, and present a flow chart that associates ML models with each branch and provides a path between your dataset and applicable models. For the third, I cover common tools in Python and R, then provide a conceptual overview of common ML models and pair them with examples from the literature in my field, space physics.
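The flow-chart idea can be sketched as a small decision function, in the spirit of scikit-learn's estimator cheat sheet. The branch points and suggested model families below are illustrative assumptions, not the chart from the talk.

```python
def suggest_model(has_labels, label_is_continuous=False,
                  want_structure="clusters"):
    """Map basic properties of a dataset to a family of ML models."""
    if has_labels:                      # supervised learning branch
        if label_is_continuous:
            return "regression (e.g. linear regression, random forest)"
        return "classification (e.g. logistic regression, SVM)"
    # unsupervised learning branch
    if want_structure == "clusters":
        return "clustering (e.g. k-means, DBSCAN)"
    return "dimensionality reduction (e.g. PCA, t-SNE)"

# Labeled data with continuous targets points to regression
print(suggest_model(has_labels=True, label_is_continuous=True))
```

The real chart has many more branches (sample size, sparsity, and so on), but the principle is the same: properties of your dataset, not taste, pick the candidate models.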
Please join us in DeMeritt Hall, Room 251, 1:00-2:00 pm.
Here are some great videos from last year's Deep Learning for Science School at Berkeley Lab, mentioned in the previous post. (Thanks Amy and Victor)
This looks like a great opportunity to immerse yourself in deep learning.
Hosted by Computing Sciences at Berkeley Lab, the school brings together researchers and engineers for lectures and tutorials on state-of-the-art deep learning methods and best practices for running deep learning on high-performance computing systems. The sessions cover both theory and practice, with emphasis on the latter. Attendees will gain an understanding of what deep learning is, what types of problems it is good for, and how to choose, build, train, and deploy deep learning models at scale for scientific applications. The school also provides ample opportunities for attendees to connect with fellow scientists who share their interests and to discuss how the latest advances in learning algorithms can be used in their science.
Join us for the second tutorial in the series, "Nonlinear regression using Bayesian inference" presented by Mr. John Donaghy, Department of Physics, UNH.
Abstract: This seminar will cover the Gaussian process, a tool capable of extracting nonlinear patterns from data using Bayesian inference. We will begin by presenting the shortcomings of last week's linear regression as motivation, proceed with an overview of the Gaussian process, and end with a brief hands-on tutorial using scikit-learn.
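As a preview of the idea, here is a NumPy-only sketch of Gaussian-process regression with a squared-exponential (RBF) kernel; the toy data and kernel length scale are illustrative choices, and the tutorial itself uses scikit-learn rather than hand-rolled linear algebra.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel k(a, b) between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# Noisy observations of a nonlinear function (linear regression would fail here)
rng = np.random.default_rng(1)
x_train = np.linspace(0, 2 * np.pi, 8)
y_train = np.sin(x_train) + 0.05 * rng.normal(size=8)
x_test = np.linspace(0, 2 * np.pi, 50)

noise = 0.05 ** 2
K = rbf(x_train, x_train) + noise * np.eye(8)   # training covariance
K_star = rbf(x_test, x_train)                   # test/train covariance

# Standard GP posterior: mean = k* K^-1 y, var = k** - k* K^-1 k*^T
alpha = np.linalg.solve(K, y_train)
mean = K_star @ alpha
var = rbf(x_test, x_test).diagonal() - np.einsum(
    "ij,ij->i", K_star, np.linalg.solve(K, K_star.T).T)
```

The posterior mean tracks the underlying sine curve between the training points, and the posterior variance quantifies how uncertain the fit is away from the data, which is the Bayesian payoff the seminar builds toward.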
DeMeritt Hall Room 251, 1:10-2:00 pm
You can find all files related to the seminars at chapmanlab.github.io/ML/. We will try to post any files needed for a seminar beforehand, and will post lectures and any other relevant files there afterward.
This series of tutorials, workshops, and lectures is aimed at exposing UNH scientists to the fundamental concepts and capabilities of modern machine learning techniques. The goal is to show all UNH scientists how machine learning may enhance their research. The schedule is evolving, but will start with very general introductions for an audience that is assumed to know nothing about machine learning.
The first lecture will take place on Wednesday, Feb. 5, 2020 at 1 pm in DeMeritt Hall Rm 251.
In preparation for the first lecture, attendees are encouraged to set up a working Python environment on their laptops that can be used during the tutorials. To do so, follow these steps:
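The exact steps depend on your machine, but a minimal setup (assuming Python 3 and pip are already installed; the package list is a guess at what the tutorials will use, not an official requirement) might look like:

```shell
# Create an isolated virtual environment for the tutorial series
python3 -m venv ml-tutorials
source ml-tutorials/bin/activate

# Install a typical scientific Python stack plus Jupyter
pip install --upgrade pip
pip install numpy scipy matplotlib scikit-learn jupyter

# Launch Jupyter to open the seminar notebooks
jupyter notebook
```

A conda environment with the same packages works equally well if you already use Anaconda.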
Stay tuned here for updates. If you have questions or suggestions for topics that you'd like to see covered, please contact firstname.lastname@example.org