Thermodynamic Computing: Better than Quantum? | Guillaume Verdon and Trevor McCourt, Extropic AI



Thermodynamic computing has the potential to revolutionize AI and machine learning by harnessing thermal fluctuations for faster, more efficient, and lower power computing systems 

Questions to inspire discussion

  • What is the future revolution in computing?

    The future revolution in computing is harnessing thermal fluctuations for physics-based computing systems.

  • What are the challenges of cooling quantum computers?

    Cooling quantum computers to near absolute zero is a major engineering challenge, and any residual finite temperature degrades the hardware.

  • Why is deep learning a better alternative for representing probability distributions?

    Sampling on digital computers consumes a lot of energy because randomness is unnatural for deterministic devices, which is why deep learning became the dominant way to represent probability distributions.

  • What are the limitations of using only normal distributions in computing?

    Restricting computing to normal (Gaussian) distributions is limiting: distributions with heavier tails and low-likelihood events are not easily representable on classical computers.

  • What is the future of AI computing?

    The future of AI computing lies in thermodynamic computers, which can achieve probabilistic machine learning and optimization more efficiently and quickly than digital or quantum computers. 

 

Key Insights

Thermodynamic Computing Advantages and Potential

  • 🌌 Thermodynamic computing is the future, based on first principles of mathematics, information theory, probability theory, and physics.
  • 🧠 Thermodynamic computing offers an alternative to quantum computing, which relies on the physics of matter at very cold temperatures, and could similarly revolutionize computational tasks.
  • 🧠 Thermodynamic computing involves filling in the blanks with uncertainty and entropy, which can be a more efficient approach compared to traditional methods.
  • 🤔 Thermodynamic computing aims for substantial constant factor speedups, potentially worth the effort of rebuilding the whole stack from first principles.
  • ⚛️ From a thermodynamics standpoint, a thermodynamic computer may be more efficient and fast for probabilistic machine learning and optimization than digital or quantum computers.
  • 🌡️ Thermodynamic computing operates in a regime without quantum coherence, using similar building blocks to quantum computing but at room temperature, making it more accessible and practical.
  • 🔮 Energy-based models shape data distributions as equilibrium states called Boltzmann distributions, using programmable probabilistic computers with parameters that can be trained to morph the equilibrium distribution.
  • 🔥 The trick in designing something that's not just kind of noisy, but very noisy, is that you have to make sure that the noise is significant compared to the other energy scales in your device.
  • 🔮 The goal is to do as much as possible natively in probabilistic physics to minimize energy costs, unlike quantum computers which require observation and therefore energy.
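The energy-based-model picture in the bullets above can be sketched in a few lines of code. This is an illustrative toy, not Extropic's hardware model: the spin energy, couplings `W`, and biases `b` are assumptions for demonstration. A Metropolis sampler whose equilibrium is the Boltzmann distribution of a programmable energy plays the role of the "programmable probabilistic computer":

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy energy-based model: E(s) = -s^T W s / 2 - b^T s over spins s in {-1, +1}^n.
# The "programmable parameters" are W (couplings) and b (biases); training them
# morphs the equilibrium (Boltzmann) distribution p(s) ∝ exp(-E(s)/T).
n = 4
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2            # symmetric couplings
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.5, size=n)
T = 1.0                      # temperature sets the noise scale

def energy(s):
    return -0.5 * s @ W @ s - b @ s

def sample(steps=20000):
    """Metropolis sampling: flip one spin, accept with prob min(1, exp(-dE/T))."""
    s = rng.choice([-1.0, 1.0], size=n)
    samples = []
    for _ in range(steps):
        i = rng.integers(n)
        s_new = s.copy()
        s_new[i] *= -1
        dE = energy(s_new) - energy(s)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s = s_new
        samples.append(s.copy())
    return np.array(samples)

samples = sample()
print("mean spin values:", samples[5000:].mean(axis=0))
```

Changing `W` and `b` reshapes the equilibrium distribution, which is the sense in which the parameters are "trained to morph" it.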

Impact and Mission of Thermodynamic Computing

  • 🔮 The future of AI is very contrary and very different, but if it succeeds, it changes everything.
  • 🤯 "If you have an idea that you think is your greatest idea of your life, that you think is gonna have the most impact to helping civilization, you gotta go all in."
  • 🧠 "I feel that computing has to go this way. I've been thinking about noise and computing and how they might help each other, how they harm each other for basically my entire academic and adult life." - Trevor McCourt
  • 🔥 The mission of thermodynamic computing is to save the world and accelerate progress, giving a deep sense of satisfaction and near infinite energy. 

 

#Abundance #AbundanceSociety  #AI #ThermodynamicComputing

XMentions: @HabitatsDigital @PeterDiamandis @GillVerd @SalimIsmail @GoingBallistic5   @DrKnowItAll16 @Extropic_AI @trevormccrt1 @DavidOrban 

 

Clips 

  • 00:00 🔥 Harnessing thermal fluctuations for physics-based computing systems is the future revolution, contrasting with the challenges of cooling quantum computers and the potential for noise-based, lower power computing.
    • Harnessing thermal fluctuations for physics-based computing systems is the future revolution.
    • Guillaume Verdon and Trevor McCourt discuss the launch of Extropic's light paper and their new thermodynamic paradigm of computing, with Guillaume sharing his background in quantum computing and Trevor giving a brief bio.
    • Trevor, an engineer, got involved in quantum machine learning and worked on device engineering and modeling before joining Extropic.
    • The speaker explains the contrast between quantum and thermodynamic computing, highlighting the challenges of cooling quantum computers to near absolute zero and the impact of finite temperature on hardware.
    • Noise in quantum computing has been a challenge, with efforts focused on error correction and reducing entropy, but the road ahead for scaling up quantum computing is long.
    • The idea is to harness noise from the environment to create physics-based computing systems that are noisier and lower power than deterministic computers, as it is inevitable to go into a thermal or probabilistic regime when making computational devices smaller.
  • 11:44 🔥 Stochastic hardware and probabilistic algorithms based on thermodynamics will disrupt AI, making deep learning more efficient and effective.
    • To run a sampling algorithm on a digital computer, you need to generate pseudo-randomness using a circuit with complex dynamics, which requires a lot of entropy and electricity, and then filter the random bit stream to get useful samples.
    • Digital computers use a lot of energy for sampling, which is inefficient and unnatural for deterministic devices, so deep learning is a better alternative for representing probability distributions.
    • High-dimensional data requires many transformations to capture tail events, and current deep learning methods are not sufficient in the low-data regime.
    • Using a probabilistic approach to fill in data with noise and uncertainty is costly and often avoided, leading to the use of old school neural nets and diffusion models.
    • Stochastic hardware and probabilistic algorithms based on thermodynamics will be disruptive and change the future of AI.
    • GPUs are good at deep learning because they excel at matrix multiplications, but accelerating only part of an algorithm will only result in a modest speedup, so it's more effective to disrupt the entire algorithm rather than just a subroutine.
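The "generate pseudo-randomness, then filter the bit stream" cost described in the clip above can be made concrete with rejection sampling. The triangular target density below is an assumed example for illustration; the point is that a deterministic machine discards a fixed fraction of the random numbers it paid energy to generate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target density (assumed for illustration): triangular p(x) = 2x on [0, 1].
def target_pdf(x):
    return 2.0 * x

M = 2.0  # envelope constant: target_pdf(x) <= M * uniform_pdf(x) on [0, 1]

def rejection_sample(n):
    """Draw cheap uniform pseudo-randoms, keep only those under the density."""
    accepted = []
    proposals = 0
    while len(accepted) < n:
        x = rng.random()           # proposal from Uniform(0, 1)
        u = rng.random()           # acceptance coin
        proposals += 1
        if u < target_pdf(x) / M:  # filter step: most draws are thrown away
            accepted.append(x)
    return np.array(accepted), proposals

samples, proposals = rejection_sample(10000)
# The acceptance rate is 1/M = 0.5: half the pseudo-random draws are wasted,
# which is the inefficiency the speakers attribute to deterministic hardware.
print(f"acceptance rate: {len(samples) / proposals:.2f}")
```

For harder, higher-dimensional targets the envelope constant grows and the accepted fraction shrinks, so the wasted-entropy cost gets much worse than this toy suggests.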
  • 20:18 🔬 Thermodynamic computing can potentially surpass quantum computing in handling complex probability distributions, disrupting the limitations of classical and quantum computers in AI and machine learning.
    • The speaker discusses the use of Gaussian distributions in computing and the limitations of using only normal distributions, suggesting the need for different types of probability distributions with chips.
    • Gaussians are easily representable by classical computers, but more complicated distributions with longer tails and low likelihood events are not, posing a challenge for machine learning algorithms.
    • LLMs are limited in their ability to handle edge cases, and the constraints of deterministic hardware have held back AI, so proposing new hardware could disrupt how software and AI works.
    • Sampling directly from a high-dimensional distribution requires storing a large amount of data in memory, which grows exponentially with the number of dimensions.
    • Representing a general probability distribution in high dimensions is difficult on a classical computer, leading to the desire for quantum computers, but quantum computers are not necessarily good for probabilistic inference.
    • An analogy: a rocket is less reliable than simply shipping something across town; likewise, quantum computers can be overkill and less dependable than simpler hardware for everyday inference tasks.
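The exponential memory growth mentioned in this clip is easy to quantify: an explicit table for a joint distribution over n binary variables needs one probability per configuration, i.e. 2^n entries. A minimal sketch (the 8-bytes-per-entry figure is an assumption for double-precision storage):

```python
# Storing a general joint distribution over n binary variables explicitly
# requires one probability per configuration: 2**n entries. This is the
# exponential blow-up that makes direct high-dimensional sampling infeasible.
def table_size_bytes(n_dims, bytes_per_entry=8):
    return (2 ** n_dims) * bytes_per_entry

for n in (10, 30, 50):
    print(f"{n} binary dims -> {table_size_bytes(n):.3e} bytes")
# 10 dims fits in 8 KB; 50 dims already needs about 9e15 bytes (petabytes),
# far beyond any classical memory.
```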
  • 26:40 🔥 Thermodynamic computing offers substantial speedups without the complexity and cost of quantum computing, with the potential for mass production at room temperature in the future.
    • Quantum computing may offer slight speed advantages for certain tasks, but the complexity and cost make it not worth the effort, while thermodynamic computing can provide substantial speedups without violating complexity theory.
    • The future of AI is not limited to current labs and quantum computing has limitations that need to be considered.
    • Quantum computing has limited practical advantages due to the lack of long-range quantum coherent effects in phenomena important to humans, and the challenges involved in building quantum computers are extremely formidable.
    • The future of computing lies in thermodynamic computers, which can achieve probabilistic machine learning and optimization more efficiently and quickly than digital or quantum computers.
    • The speakers left secret labs at Google and in Santa Barbara to join forces and develop a thermodynamic computer that can be built with existing circuit-manufacturing techniques, operates at room temperature, and is manufacturable at large scale.
    • Early thermodynamic prototypes use superconducting chips for fast, efficient neural computing, with room-temperature, mass-producible versions envisioned for the future.
  • 41:23 🔥 Energy-based models connect machine learning and thermodynamic processes, using circuits with thermal noise to create programmable sampling machines, aiming to reduce energy costs and efficiently translate information from the thermal system to the classical regime.
    • Energy-based models are used to model data distributions as equilibrium states, with programmable parameters that can be trained to morph the equilibrium distribution, connecting machine learning and thermodynamic processes.
    • Circuits experience thermal noise, and by designing a device with significant noise compared to other energy scales, you can create a programmable sampling machine using tunable circuit components, with the input being weights or parameters and the output being data.
    • Applying voltage to a circuit changes its behavior and observing the random dynamics of the circuit over time allows for inference of the distribution of values.
    • The goal is to do as much as possible natively in probabilistic physics to reduce energy costs, which is a challenge for quantum computers.
    • The challenge in thermodynamic computing is to efficiently translate the information from the thermal system to the classical regime without losing accuracy, similar to the readout problem in quantum computing.
    • The idea is to use physics as a physical process in the device to amplify signals and reduce noise in a CMOS package.
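The "noisy circuit as a sampler" idea in this clip can be sketched with overdamped Langevin dynamics: a variable driven by a deterministic force from a programmable energy plus thermal noise relaxes to the corresponding Boltzmann distribution. The quadratic energy and the parameters `mu`, `k`, `T` below are illustrative assumptions, not Extropic's circuit:

```python
import numpy as np

rng = np.random.default_rng(0)

# A continuous "circuit" variable x driven by drift -E'(x) plus thermal noise
# relaxes to the Boltzmann distribution p(x) ∝ exp(-E(x)/T).
# The programmable "weights" here are the parameters (mu, k) of the energy.
mu, k, T = 1.5, 2.0, 0.5      # energy parameters and noise scale
dt, steps = 1e-3, 200_000

def grad_E(x):
    return k * (x - mu)        # E(x) = k * (x - mu)**2 / 2

x = 0.0
trace = np.empty(steps)
for t in range(steps):
    # Euler-Maruyama step: drift toward the energy minimum + thermal kicks.
    x += -grad_E(x) * dt + np.sqrt(2 * T * dt) * rng.normal()
    trace[t] = x

# At equilibrium the samples are Gaussian with mean mu and variance T/k.
burn = trace[steps // 2:]      # discard the first half as burn-in
print(f"empirical mean {burn.mean():.2f} (target {mu}), "
      f"variance {burn.var():.2f} (target {T / k})")
```

Observing the trajectory over time is the "observe the random dynamics" step from the clip: the histogram of the trace is the programmed distribution, and no explicit random-number generation or filtering is needed.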
  • 50:40 🔥 Thermodynamic computing overcomes quantum coherence issues, embraces noise and thermalization for faster algorithms, and holds the future of AI computing with confidence in its success.
    • Thermodynamic computing does not experience quantum coherence issues and is not limited by quantum effects in transistors, but rather by the size of the device and the potential for metastable systems.
    • Noise and thermalization can actually help algorithms go faster, so instead of trying to extend coherence times, we should embrace and use the natural tendency to thermalize as a building block for our algorithms.
    • The speakers have progressed from a glimmer of an idea to actually building chips and have gained more confidence in the success of thermodynamic computing.
    • The future of computing for AI lies in the intersection of probabilistic machine learning and stochastic electronics, with confidence built from years of investigation and team collaboration.
    • The speaker discusses the idea of quantum computing and the importance of fully committing to a groundbreaking idea.
    • The speakers discuss the acceleration of their work in thermodynamic computing and the inevitability of the direction computing is heading in.
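The claim in this clip that noise and thermalization can help algorithms go faster is classically illustrated by simulated annealing, where thermal kicks let the state escape local minima before cooling freezes it in a deep one. The double-well objective and cooling schedule below are assumed toys, not from the talk:

```python
import numpy as np

# Double-well objective (assumed toy): two minima, the one near x ≈ -1 is deeper.
def f(x):
    return (x**2 - 1) ** 2 + 0.3 * x

def anneal(seed, steps=5000):
    """Metropolis dynamics with a geometric cooling schedule."""
    rng = np.random.default_rng(seed)
    x = 1.0                               # start in the shallower well
    for step in range(steps):
        T = max(0.001, 0.999 ** step)     # temperature decays toward zero
        x_new = x + rng.normal(scale=0.2)
        dE = f(x_new) - f(x)
        # Uphill moves are accepted with prob exp(-dE/T) while T is high,
        # which is exactly the "noise helps" mechanism.
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            x = x_new
    return x

finals = [anneal(seed) for seed in range(10)]
print("runs ending in the deeper well (x < 0):", sum(xf < 0 for xf in finals))
```

With a slow enough cooling schedule, most runs end in the deeper well near x ≈ -1, which pure gradient descent started at x = 1 would never reach.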
  • 57:39 🔬 Thermodynamic computing explores the potential of building modular physics-based devices using superconducting metals and silicon, aiming to surpass the energy efficiency and density of the human brain for AI applications.
    • The speakers are exploring the potential of thermodynamic computing as an alternative to quantum computing, with a focus on building modular physics-based devices using superconducting metals and silicon.
    • Computing is evolving to embed math into physical processes, with a focus on bridging the gap between algorithms and the physics of the device to maximize performance.
    • The speaker discusses the development of neuromorphic devices that leverage out-of-equilibrium thermodynamics for probabilistic machine learning, aiming to surpass the energy efficiency and density of the human brain and support current deep learning and machine learning applications.
    • The speaker discusses the potential of thermodynamic computing and its advantages over quantum computing in terms of harnessing compute from nature and the potential for new algorithms in the AI space.
    • The speaker discusses the development of thermodynamic computing and invites talented individuals to join their mission.
    • The speakers discussed first principles and the possibility of publishing a full white paper in the future.
  • 01:12:36 👍 The speaker expresses gratitude and appreciation.

    -------------------------------------

    Duration: 1:12:58

    Publication Date: 2024-06-30T23:50:55Z

    WatchUrl: https://www.youtube.com/watch?v=OwDWOtFNsKQ

    -------------------------------------

