s1.sargarmi.af - /drive3/Education/F04/Udemy.Modern.C.sharp.Microservices.with.Docker.K8s.and.AI/11 - AI LLM Service Foundations/

2/7/2026 7:46 AM 13498122 79. What Is an LLM Service.mp4
2/7/2026 7:46 AM 10390560 80. Prompt Engineering Basics.mp4
2/7/2026 7:46 AM 12573049 81. Temperature, Max Tokens, System Role.mp4
2/7/2026 7:46 AM 6586563 82. Local LLMs vs Cloud APIs.mp4
2/7/2026 7:46 AM 7628603 83. How to Run a Local LLM via Docker.mp4
2/7/2026 7:46 AM 9732038 84. Optimizing AI Response Time.mp4
2/7/2026 7:46 AM 11372591 85. Caching AI Responses.mp4
2/7/2026 7:46 AM 8709537 86. Logging Prompts & Outputs.mp4
2/7/2026 7:46 AM 8958120 87. AI Safety & Rate Limiting.mp4
2/7/2026 7:46 AM 11648450 88. Real Use Case Evaluate Picture Text Microservice.mp4