Analysis of search results for 'neural networks'
Search result threads (sorted by score)
How a pasta machine explains neural networks
Created these infographics about ML concepts:
- Backpropagation
- Bayesian Neural Networks
- Gradient Descent
- Stochastic Gradient Descent
A simple breakdown of the modern AI stack:
🧠 LLM = Brain — Thinks and generates text
📚 RAG = Brain + Books — Retrieves trusted knowledge
🛠️ AI Agent = Brain + Hands — Takes actions using tools
⚡ MCP = Nervous System — Connects models, APIs, and systems
In short: LLMs think, RAGs read, Agents act, MCP connects.
Meanwhile, we've been cooking on NVIDIA Omniverse with neural networks.
My 2026 AI stack (for now…):
1. Claude: Writing that feels like me
2. NotebookLM: Deep learning compressed into minutes
3. Claude Code: Coding software when it really matters
4. Claude Cowork: Building apps while on the treadmill ;)
5. Grok: Researching when you need the most current info
6. Manus: Getting sh*t done without doing it myself
When data scientists start a project, they always ask: "How does the output change when the input changes?" The equation y = ax + b answers this in the simplest way; neural networks answer it in a more flexible way. In this tutorial, Samyutka explains how neural networks work using y = ax + b. https://www.freecodecamp.org/news/neural-networks-explained-using-y-ax-b/
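The idea in the tutorial above can be sketched in a few lines: a single neuron computes exactly y = a*x + b, and a network stacks such units with a nonlinearity in between, which is what makes the answer "more flexible" than one straight line. This is a minimal illustration with hypothetical hand-picked weights, not the tutorial's own code.

```python
# A single neuron is just the familiar line y = a*x + b.
def neuron(x, a, b):
    return a * x + b

# A nonlinearity (here ReLU) between layers lets the network
# bend the straight line into a flexible piecewise curve.
def relu(z):
    return max(0.0, z)

def tiny_network(x):
    # Hypothetical weights, chosen only for illustration.
    h1 = relu(neuron(x, 1.0, 0.0))   # hidden unit 1
    h2 = relu(neuron(x, -1.0, 1.0))  # hidden unit 2
    # Output layer combines the hidden units linearly again.
    return neuron(h1, 2.0, 0.0) + neuron(h2, 3.0, 0.0)

print(tiny_network(0.5))  # no longer a single straight line in x
```

With two hidden units the input-output curve already has two bends; real networks just use many more units and learn the weights instead of hand-picking them.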
AI just learned to predict the future, not just react to it. Researchers trained neural networks to chase a moving target. Small networks just reacted. Big networks guessed where the target would go next.
→ Reactive nets track distance only
→ Predictive nets build world maps
→ Big networks wait at reappear spots
→ Live mice did the same thing
Size of the neural code decides reflex vs planning.
Into neural networks, human behavior & emotional healing? Let’s connect 🔬💙
keisler-2022: open-weight "Forecasting Global Weather with Graph Neural Networks". Highlights:
• 10-day forecast in <1 min
• Initialize forecasts from ERA5 or IFS analysis
• Scripts for eval, sensitivities, & Hurricane Sandy
https://github.com/rkeisler/keisler-2022
Goodfire AI's research: since neural networks think in shapes, understanding their rich *neural geometry* is key to understanding how they work – and to debugging and controlling them with precision. The World Inside Neural Networks: https://www.goodfire.ai/research/the-world-inside-neural-networks
📚 This new 2026 eBook from Manning is everything you need to create neural networks with PyTorch, including large language and diffusion models. Dive into Deep Learning with PyTorch, Second Edition: 🔗 https://loom.ly/Rxpm5gg Subscription options: 💻 https://loom.ly/-F0YO94 IEEE #IEEEXplore #DeepLearning #MachineLearning #NeuralNetworks #GenerativeAI #eBooks
Dear Algo, show me individuals who speak about neuroplasticity, quantum physics, and creating new neural networks, and who believe in God. People who are working on new ways to do old things and creating true transformation in their communities. Specifically in North Carolina.
A big request to anyone who uses neural networks for coding and work: tell me about the pros and cons of switching from Claude to ChatGPT.
Most people don’t understand backpropagation. Here’s the simplest way to see it. Neural networks learn by:
1. Making a prediction
2. Calculating the error
3. Adjusting the weights
Repeat until accuracy improves. That’s it.
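The predict → error → adjust loop above can be sketched for a single weight. This is an illustrative toy (one weight, made-up data following y = 2x, a hand-picked learning rate), not a full backpropagation implementation.

```python
# Fit y = w * x to toy data generated by y = 2x,
# using the loop: predict, measure error, adjust, repeat.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0    # initial weight
lr = 0.05  # learning rate (hand-picked)

for _ in range(200):           # 4. repeat
    for x, y_true in data:
        y_pred = w * x             # 1. make a prediction
        error = y_pred - y_true    # 2. calculate the error
        grad = 2 * error * x       # gradient of squared error w.r.t. w
        w -= lr * grad             # 3. adjust the weight downhill

print(round(w, 3))  # converges toward 2, the true slope
```

Backpropagation proper is this same idea applied layer by layer: the chain rule distributes each output error back to every weight in the network.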
Multi Layer Perceptron Neural Networks playing Flappy Bird
Intro to Neural Networks. #BigData #Analytics #DataScience #AI #MachineLearning #IoT #IIoT #PyTorch #Python #RStats #TensorFlow #Java #JavaScript #ReactJS #GoLang #CloudComputing #Serverless #DataScientist #Linux #Programming #Coding #100DaysofCode https://bit.ly/3uiGaG3
🚀 Artificial Neural Networks (ANNs) are playing a major role in transforming modern engineering and intelligent systems. Their capability to learn complex nonlinear relationships from data makes them highly effective for:
• prediction
• classification
• pattern recognition
• fault diagnosis
• intelligent decision-making applications
Advanced architectures and hybrid AI approaches continue to push the boundaries of intelligent predictive systems and engineering optimization.
The next major computing shift isn't in the cloud; it's on local silicon, where Edge AI processes neural networks directly on-device, eliminating transmission latency and fundamentally decentralizing global computing power. 🧠⚡ #EdgeAI #MachineLearning #TechHardware
Generative #AI course by MIT🔥
🔗 Course link - https://m.youtube.com/playlist?list=PLXV9Vh2jYcjbnv67sXNDJiO8MWLA3ZJKR
Topics include:
→ ChatGPT
→ Stable Diffusion & DALL·E
→ Neural Networks
→ Supervised Learning
→ Representation & Unsupervised Learning
→ Reinforcement Learning
→ Generative AI
→ Self-Supervised Learning
→ Foundation Models
→ GANs (Adversarial)
→ Contrastive Learning
→ Autoencoders
→ Denoising & Diffusion
Follow @aimindscope for more AI insights
Ever wondered what's *really* happening under the hood in deep learning? This new theory unpacks the computational principles driving neural networks - and it's mind-blowing how complex yet elegant these systems are. 🧠🔍 AI isn't magic, it's math and architecture.