Synthetic Intelligence RSS
All-In Summit: Stephen Wolfram on computation, AI, and the nature of the universe
This talk was recorded live at the All-In Summit 2023 at Royce Hall on UCLA's campus in Los Angeles. (0:00) Dave welcomes Stephen Wolfram to All-In Summit ‘23! (2:37) Computational irreducibility (4:58) The paradox of simple heuristics (6:49) AI (8:57) Cellular automata (14:10) Limitations of AI (18:13) Syntax, logic, LLMs and other high-potential AI realms (23:57) Generative AI and interconcept space (26:20) The nature of the universe (29:54) Electrons – size, topology and structure (31:18) Time, spacetime, gravity and the boundaries of human observers (36:53) Persistence and other elements of consciousness humans take for granted (38:09) The concept of the...
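The chapters on computational irreducibility and cellular automata map onto Wolfram's classic Rule 30 example: a tiny local update rule whose long-run pattern generally can't be predicted without actually running it. The sketch below is illustrative only (not from the talk); the grid width, step count, and wrap-around boundary are arbitrary choices.

```python
# A minimal sketch of Wolfram's Rule 30 cellular automaton, the standard
# illustration of computational irreducibility: simple rules, but no known
# shortcut to the long-run behavior other than simulating each step.
WIDTH, STEPS = 63, 24
RULE = 30
row = [0] * WIDTH
row[WIDTH // 2] = 1  # start from a single black cell in the middle

for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    # Each cell's next state depends only on itself and its two neighbors
    # (wrap-around boundary); the rule number encodes all 8 cases as bits.
    row = [
        (RULE >> ((row[(i - 1) % WIDTH] << 2) | (row[i] << 1) | row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```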
Making Chat (ro)Bots
We created a robot tour guide using Spot integrated with ChatGPT and other AI models as a proof of concept for the robotics applications of foundation models. Learn more: https://bostondynamics.com/blog/robots-that-can-chat/ Project Team: Matt Klingensmith, Michael Macdonald, Radhika Agarwal, Chris Allum, Rosalind Shinkle #BostonDynamics #chatgpt 00:00: Introduction 01:08: Making Chat (ro)Bots 01:39: Precious Metal Cowgirl 02:38: A Robot Tour Guide 03:07: How does it work? 04:19: Shakespearean Time Traveler 04:36: Creating Personalities 04:54: "Josh" 05:23: Lateral Thinking 06:03: Teenage Robot 06:43: Nature Documentary 07:32: What's next? 2023-10-26T14:34:03Z https://www.youtube.com/embed/djzOBZUFzTw
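As a rough sketch of the "Creating Personalities" idea in the video, the snippet below prompts an LLM with a tour-guide persona and answers a visitor question in that voice. The OpenAI chat-completions call is standard; the persona text, the list of tour stops, and any connection to Spot's actual perception and control stack are assumptions for illustration, not Boston Dynamics' implementation.

```python
# Hedged sketch: an LLM "personality" layer for a robot tour guide.
# Assumes OPENAI_API_KEY is set in the environment; persona and stops are made up.
from openai import OpenAI

client = OpenAI()

PERSONA = (
    "You are a robot tour guide at a robotics company. "
    "Stay in character, keep answers under three sentences, "
    "and only discuss these stops: the lobby, the charging dock, the test lab."
)

def answer_visitor(question: str) -> str:
    # One round trip per visitor question, with the persona as a system prompt.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer_visitor("What happens at the charging dock?"))
```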
“What's wrong with LLMs and what we should be building instead” - Tom Dietterich - #VSCF2023
Thomas G. Dietterich is emeritus professor of computer science at Oregon State University and one of the pioneers of machine learning. He served as executive editor of the journal Machine Learning (1992–98) and co-founded the Journal of Machine Learning Research. He is a member of the valgrAI Scientific Council. Keynote: “What's wrong with LLMs and what we should be building instead” Abstract: Large Language Models provide a pre-trained foundation for training many interesting AI systems. However, they have many shortcomings: they are expensive to train and to update, their non-linguistic knowledge...
How to Use AutoGen & GPT-4 to Create Multiple AI Agents
In this AutoGen tutorial for beginners, you'll learn how to build a team of autonomous AI agents powered by OpenAI's GPT-4. AutoGen is a cutting-edge framework for building multi-agent AI assistants, pulling ahead of competitors such as MetaGPT and ChatDev. 🤝 Connect with me 🤝 LinkedIn: https://www.linkedin.com/in/kris-ograbek/ Medium: https://medium.com/@kris-ograbek +++ Useful Resources +++ Code: https://colab.research.google.com/drive/11HiXpnPNIN3WIJK76TG-tsraix_lhb0M?usp=sharing +++ Sources for Autogen +++ Docs: https://microsoft.github.io/autogen/docs/Examples/AutoGen-AgentChat GitHub: https://github.com/microsoft/autogen/tree/main Official Paper: https://arxiv.org/abs/2308.08155 Multi-agent Conversation Framework: https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat/ SDK: https://microsoft.github.io/autogen/docs/reference/agentchat/conversable_agent/ Chapters: 0:00 Intro 1:24 Feature 1: Complete flexibility 2:39 Feature 2: Human participation 4:04 Feature 3: Multi-agent conversations 5:06 Feature 4: Flexible autonomy 5:50 User Proxy Agent...
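For context on the features listed above (human participation, multi-agent conversations, the User Proxy Agent), here is a minimal two-agent AutoGen sketch using the documented AssistantAgent and UserProxyAgent classes; the model choice, task message, and working directory are assumptions, and it expects an OPENAI_API_KEY in the environment rather than matching the video's exact setup.

```python
# Minimal sketch of a two-agent AutoGen conversation (pyautogen).
from autogen import AssistantAgent, UserProxyAgent

# Assumption: only the model is specified; the API key is read from OPENAI_API_KEY.
config_list = [{"model": "gpt-4"}]

# The assistant proposes code and answers; the user proxy can execute code
# locally and relay human feedback.
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",  # fully autonomous; "ALWAYS" inserts a human review step each turn
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# Kick off the multi-agent conversation with an illustrative task.
user_proxy.initiate_chat(assistant, message="Plot NVDA's stock price for the last month.")
```

Switching human_input_mode to "ALWAYS" gives the human-in-the-loop behavior the video calls "human participation", while "NEVER" exercises the flexible-autonomy side.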
GOOGLE STUBBS - Google's answer to AutoGen and ChatDev that's powered by Gemini? | BREAKING AI NEWS
Get on my daily AI newsletter 🔥 https://natural20.beehiiv.com/subscribe [News, Research and Tutorials on AI] See more at: https://natural20.com/ My AI Playlist: https://www.youtube.com/playlist?list=PLb1th0f6y4XROkUAwkYhcHb7OY9yoGGZH 2023-10-25T17:50:32Z https://www.youtube.com/embed/nCMfBWOfkqg