"Our tailored course provided a well rounded introduction. It covered topics that we needed to know. The instructor genuinely cared about our learning. We felt supported from start to finish and left with knowledge that truly mattered to our work." Brian Leek, Data Analyst, May 2024
IT professionals and cloud engineers who manage or support AWS environments.
Developers and software engineers building applications that use generative AI.
Data engineers, data scientists, and AI practitioners exploring LLMs and RAG solutions.
Solution architects and technical leads designing AI-driven systems on AWS.
Product managers and IT leaders evaluating generative AI use cases and ROI.
Business analysts and automation teams identifying opportunities for AI-enabled workflows.
"Our tailored course provided a well rounded introduction. It covered topics that we needed to know. The instructor genuinely cared about our learning. We felt supported from start to finish and left with knowledge that truly mattered to our work." Brian Leek, Data Analyst, May 2024
"JBI did a great job of customizing their syllabus to suit our business needs and also bringing our team up to speed on the current best practices." Brian F, Team Lead, RBS, Data Analysis Course, 20 April 2022
This course provides a comprehensive, hands-on introduction to building, integrating, and operationalizing generative AI solutions using AWS technologies. Participants will learn how Large Language Models (LLMs) work, how to leverage AWS Bedrock and related AI services, and how to implement secure, scalable, and cost-efficient generative AI applications tailored for IT operations and enterprise environments.
Through a combination of conceptual instruction, architectural walkthroughs, and practical labs, learners will gain the skills needed to evaluate generative AI opportunities, build working prototypes, integrate models into existing systems, and plan real-world implementation projects.
Generative AI creates new content—such as text, images, or code—based on patterns learned during training. Traditional AI typically focuses on prediction, classification, or detection. Generative models like LLMs can produce human-like outputs and support advanced reasoning tasks.
An LLM is a deep learning model trained on massive amounts of text data to understand and generate natural language. Examples include Anthropic Claude and Meta Llama, which are accessible through AWS Bedrock.
AWS Bedrock is a fully managed service that provides access to leading foundation models through a simple API. It allows you to use generative AI without managing infrastructure or training large models yourself.
No. Bedrock provides ready-to-use models that you can integrate through prompts and APIs. Basic AWS and programming experience is helpful but deep ML expertise is not required.
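As a concrete sketch of what "integrate through prompts and APIs" looks like, the snippet below builds a request body for a Claude model on Bedrock using Anthropic's Messages schema. The model ID shown is one example; substitute whichever model your account has access to.

```python
import json

# Build the JSON request body for an Anthropic Claude model on Bedrock.
# The "messages" structure follows Anthropic's Messages API schema.
def build_claude_request(prompt: str, max_tokens: int = 512) -> str:
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

# Calling Bedrock itself needs AWS credentials and boto3, e.g.:
#
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.invoke_model(
#       modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
#       body=build_claude_request("Summarise this incident report: ..."),
#   )
#   print(json.loads(response["body"].read())["content"][0]["text"])
```

Note that the application code is a handful of lines: no model hosting, no GPUs, no training pipeline.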
AWS Bedrock offers multiple families of foundation models, including:
Anthropic Claude (text reasoning, conversation, analysis)
Meta Llama (general-purpose and code generation)
Other models for text and image generation, embeddings, and more.
Prompt engineering is the practice of structuring inputs to guide an LLM’s output. Good prompts improve accuracy, reduce hallucinations, and optimize performance, especially for enterprise use cases.
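A simple way to apply this in practice is a reusable prompt template: spelling out the role, task, constraints, and grounding context tends to produce more reliable answers than a bare question. The template below is an illustrative sketch, not a prescribed format.

```python
# Reusable prompt template: explicit role, task, rules, and context.
TEMPLATE = """You are an IT operations assistant.

Task: {task}

Rules:
- Answer only from the context below; if the answer is missing, say "I don't know".
- Reply in at most {max_sentences} sentences.

Context:
{context}
"""

def build_prompt(task: str, context: str, max_sentences: int = 3) -> str:
    return TEMPLATE.format(task=task, context=context,
                           max_sentences=max_sentences)

prompt = build_prompt(
    task="Explain why the nightly backup job failed.",
    context="Job log: S3 PutObject returned AccessDenied at 02:14 UTC.",
)
```

Constraining the model to the supplied context is also a cheap first defence against hallucination.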
Yes. You can integrate Bedrock with Lambda, API Gateway, S3, OpenSearch, and other AWS services to build chatbots, document analysis tools, automation systems, and RAG applications.
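To make the Lambda integration concrete, here is a minimal handler sketch that forwards a question to Bedrock. The Bedrock client is injectable so the handler can be unit-tested with a stub; the model ID is an example placeholder.

```python
import json

def handler(event, context=None, client=None):
    """Minimal Lambda handler that asks a Claude model on Bedrock."""
    if client is None:  # in Lambda: real client (needs AWS credentials)
        import boto3
        client = boto3.client("bedrock-runtime")
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": event["question"]}],
    }
    response = client.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
        body=json.dumps(body),
    )
    answer = json.loads(response["body"].read())["content"][0]["text"]
    return {"statusCode": 200, "body": json.dumps({"answer": answer})}
```

Fronting this handler with API Gateway gives you a working chatbot endpoint; swapping the event source for S3 notifications turns it into a document-analysis trigger.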
RAG combines LLMs with external data retrieval. It allows an LLM to reference your company’s documents or databases, enabling accurate and context-aware responses using up-to-date information.
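The retrieval step can be illustrated with a deliberately tiny sketch: score documents by word overlap with the question, then paste the best match into the prompt. Real systems would use vector embeddings and a store such as Amazon OpenSearch, but the shape of the pipeline is the same.

```python
# Toy retrieval step for RAG: pick the document sharing the most words
# with the question. Production systems use embeddings instead.
def retrieve(question: str, documents: list[str]) -> str:
    q_words = set(question.lower().split())
    return max(documents,
               key=lambda d: len(q_words & set(d.lower().split())))

docs = [
    "VPN access requires an approved MFA device.",
    "Backups run nightly at 02:00 UTC to S3.",
    "New laptops are imaged by the service desk.",
]
context = retrieve("When do the nightly backups run?", docs)
prompt = (f"Answer using only this context:\n{context}\n\n"
          f"Question: When do the nightly backups run?")
```

The retrieved context is injected into the prompt at request time, which is why RAG answers can reflect documents the model never saw during training.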
AWS provides IAM-based access control, private networking options, encryption, audit logging, and secure API endpoints. You retain full control over your data and permissions when using Bedrock.
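IAM-based access control typically means scoping callers to specific models. The policy below is a least-privilege sketch allowing invocation of a single example model; adjust the region and model ID to match your deployment.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel"],
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0"
    }
  ]
}
```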
Costs are typically based on model usage—measured in tokens processed—and the specific model chosen. Optimizing prompts, selecting efficient models, and using caching strategies can help reduce costs.
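A back-of-envelope cost estimate helps when comparing models. The helper below uses a rough "~4 characters per token" heuristic, and the prices in the example are purely illustrative; always check the current AWS Bedrock pricing page for your chosen model.

```python
# Rough Bedrock cost estimate. The 4-chars-per-token heuristic and the
# example prices are illustrative assumptions, not real AWS pricing.
def estimate_cost(prompt: str, expected_output_chars: int,
                  price_in_per_1k: float, price_out_per_1k: float) -> float:
    tokens_in = len(prompt) / 4
    tokens_out = expected_output_chars / 4
    return ((tokens_in / 1000) * price_in_per_1k
            + (tokens_out / 1000) * price_out_per_1k)

# A 2,000-character prompt with a ~1,000-character answer at hypothetical
# prices of $0.25 (input) / $1.25 (output) per 1K tokens:
cost = estimate_cost("x" * 2000, 1000, 0.25, 1.25)
```

Because output tokens are usually priced higher than input tokens, capping `max_tokens` and tightening prompts are often the quickest cost levers.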
No. AWS Bedrock provides pre-trained foundation models. If needed, Amazon SageMaker can be used for fine-tuning or training custom models, but it's optional.
You can use any AWS-supported SDK, including Python, JavaScript/TypeScript, Java, Go, and others, to call Bedrock APIs and build applications.
CONTACT
+44 (0)20 8446 7555
Copyright © 2025 JBI Training. All Rights Reserved.
JB International Training Ltd - Company Registration Number: 08458005
Registered Address: Wohl Enterprise Hub, 2B Redbourne Avenue, London, N3 2BS