We're excited to announce Generative AI for Software Development, a new course now available for pre-enrollment. Generative AI is reshaping developers' workflows, and this course offers a comprehensive path to understanding and applying generative AI in real-world software development. Taught by Laurence Moroney, Chief AI Scientist at VisionWorks Studios and former AI lead at Google, the course shows you how to use LLMs for the core tasks of a software developer or engineer, including code generation, optimization, debugging, and documentation, and helps you sharpen your efficiency and creativity with the tools and techniques you explore.
🔗 Integrate generative AI tools into your workflow.
🐛 Optimize and debug code with AI.
🚀 Develop advanced software solutions using AI.
Generative AI for Software Development is available for pre-enrollment on Coursera now, and you can earn a certificate upon successful completion! Pre-enroll now and be the first to join: https://hubs.la/Q02GL_tx0
DeepLearning.AI
Software Development
Palo Alto, California 1,025,707 followers
Making world-class AI education accessible to everyone
About us
DeepLearning.AI is making world-class AI education accessible to people around the globe. DeepLearning.AI was founded by Andrew Ng, a global leader in AI.
- Website
- http://DeepLearning.AI
- Industry
- Software Development
- Company size
- 11-50 employees
- Headquarters
- Palo Alto, California
- Type
- Privately Held
- Founded
- 2017
- Specialties
- Artificial Intelligence, Deep Learning, and Machine Learning
Products
DeepLearning.AI
Online Course Platforms
Learn the skills to start or advance your AI career | World-class education | Hands-on training | Collaborative community of peers and mentors.
Locations
- Primary
2445 Faber Pl
Palo Alto, California 94303, US
Employees at DeepLearning.AI
Updates
-
Amazon hired most of Adept AI's leadership and staff to enhance its agentic AI and automation capabilities. This move aligns with Amazon's broader strategy to compete in AI, following its $4 billion investment in Anthropic and ongoing development of new AI models. Learn more in #TheBatch: https://hubs.la/Q02GGtBs0
-
DeepLearning.AI reposted this
We value your feedback and want to make our events even better for you. 📅 Could you spare a few minutes to fill out a quick survey? Your insights will help us create more enjoyable and beneficial experiences: https://hubs.la/Q02GznNF0 We can’t wait to see you at our next event!
-
The rise of AI is complicating efforts by Google and other major tech companies to meet their greenhouse gas emissions targets. Learn how these companies are performing and the actions they’re taking to tackle this challenge in #TheBatch: https://hubs.la/Q02Gtf2n0
AI and Data Center Boom Challenges Big Tech's Emissions Targets
deeplearning.ai
-
Initialization can have a significant impact on convergence in training deep neural networks. Simple initialization schemes can accelerate training, but they require some care to avoid common pitfalls. Read our guide to learn how to initialize neural network parameters effectively: https://hubs.la/Q02Gn9Fk0
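To make the idea concrete, here is a minimal sketch (not the guide's code) of one popular scheme, He initialization, which scales each layer's weight variance by 2/fan_in so activations neither vanish nor explode in ReLU networks. The layer sizes and function names here are illustrative assumptions:

```python
import numpy as np

def he_init(fan_in, fan_out, rng=np.random.default_rng(0)):
    # He initialization: draw weights from N(0, 2/fan_in) so the
    # activation variance stays roughly constant through ReLU layers.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Weights for a hypothetical 784-256-64-10 network; biases start at zero,
# a common companion choice that avoids breaking the variance argument.
sizes = [784, 256, 64, 10]
weights = [he_init(m, n) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
```

A pitfall this avoids: initializing all weights to the same constant makes every unit in a layer compute the identical function, so gradient descent can never break the symmetry.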
-
Anthropic introduced Artifacts, a feature that lets users work with generated outputs as standalone files in Claude. Find more details in #TheBatch: https://hubs.la/Q02GkffD0
Claude 3.5 Sonnet's Artifacts Makes It Easier to Build and Code On-Site
deeplearning.ai
-
Yesterday we launched Prompt Compression and Query Optimization, our latest short course, taught by MongoDB's Richmond Alake!
🔍 Combine vector search capabilities with traditional database operations to build efficient, cost-effective RAG applications.
🛠️ Learn key techniques like pre-filtering, post-filtering, and projection for faster query processing and optimized output.
💡 Reduce prompt lengths with prompt compression for large-scale applications.
Enroll now and start optimizing your RAG applications: https://hubs.la/Q02GjnWS0
-
DeepLearning.AI reposted this
AI/GenAI Team Leader and IIT Chair at NielsenIQ | PhD in AI at UAH | Top 10 technologists under 35 years old in Spain (Nova111 winner) | 16x professional and academic awards | 12x patents | 33x publications
🔬 I love continuous learning. #RAG is one of the hot topics in #GenAI, so I decided to look into the new course by Andrew Ng (DeepLearning.AI) and Richmond Alake (MongoDB) on Prompt Compression and Query Optimization.
🚀 This course teaches you to combine traditional database capabilities with vector search using MongoDB for RAG. You'll learn these techniques:
- Vector search: semantic matching of user queries.
- Filtering using metadata: pre- and post-filtering to narrow search results.
- Projections: selecting only the necessary fields to minimize the data returned.
- Boosting: reranking results to improve relevance.
- Prompt compression: using a small LLM to compress context, significantly reducing token count and processing costs.
💡 If you want to take this short course, I strongly recommend it; the content is great. Follow this link to enroll: https://lnkd.in/dJ_pYk3Y
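To illustrate how three of the retrieval techniques mentioned above fit together, here is a library-agnostic sketch (not the course's MongoDB code) of pre-filtering, vector ranking, and projection over a tiny in-memory corpus. All document fields, vectors, and helper names are hypothetical:

```python
import numpy as np

# A toy corpus; "vec" stands in for a real embedding.
docs = [
    {"text": "MongoDB supports vector search.", "year": 2024,
     "vec": np.array([0.9, 0.1])},
    {"text": "Old release notes.", "year": 2019,
     "vec": np.array([0.8, 0.2])},
    {"text": "Cooking with cast iron.", "year": 2024,
     "vec": np.array([0.1, 0.9])},
]

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec, min_year, k=2):
    # Pre-filtering: discard documents by metadata BEFORE scoring,
    # shrinking the candidate pool and the work per query.
    pool = [d for d in docs if d["year"] >= min_year]
    # Vector search: rank the survivors by semantic similarity.
    ranked = sorted(pool, key=lambda d: cosine(query_vec, d["vec"]),
                    reverse=True)
    # Projection: return only the fields the prompt needs,
    # minimizing the data (and later, tokens) passed downstream.
    return [{"text": d["text"]} for d in ranked[:k]]

context = retrieve(np.array([1.0, 0.0]), min_year=2023)
```

In a real deployment the filter, ranking, and projection would typically run inside the database (e.g. as stages of an aggregation pipeline) rather than in application code, which is the efficiency point the course makes.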
-
This week in #TheBatch:
🎨 All about Claude's Artifacts feature
🌱 AI growth hinders carbon emissions goals
🧠 GaLore, a memory-efficient method for fine-tuning and pretraining
Plus: Andrew Ng shares his concerns about the proposed California regulation SB 1047.
Read The Batch now: https://hubs.la/Q02G5YpM0
AI's Cloudy Path to Zero Emissions, Amazon's Agent Builders, and more
deeplearning.ai