The PPT contains the following content:
1. What is Google Cloud Study Jam
2. What is Cloud Computing
3. Fundamentals of cloud computing
4. What is Generative AI
5. Fundamentals of Generative AI
6. Brief overview of Google Cloud Study Jam
7. Networking Session
How to fine-tune and develop your own large language model.pptx (Knoldus Inc.)
In this session, we will see what large language models are and how we can fine-tune a pre-trained LLM with our own data, covering data preparation, model training, and model evaluation.
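The three stages named above (data preparation, model training, model evaluation) can be sketched in plain Python. This is an illustrative toy stand-in, not any specific fine-tuning library; all function names and the tiny "model" are hypothetical.

```python
# Toy sketch of the fine-tuning workflow: prepare data, train, evaluate.
# The "model" here is a per-token label-frequency table, standing in for
# a real LLM fine-tuning run.

def prepare_data(records, split=0.8):
    """Clean records and split them into train/eval sets."""
    cleaned = [(text.strip().lower(), label) for text, label in records if text.strip()]
    cut = int(len(cleaned) * split)
    return cleaned[:cut], cleaned[cut:]

def train(examples):
    """Stand-in training step: count label frequencies per token."""
    model = {}
    for text, label in examples:
        for token in text.split():
            model.setdefault(token, {}).setdefault(label, 0)
            model[token][label] += 1
    return model

def evaluate(model, examples):
    """Accuracy of majority-vote predictions on held-out examples."""
    correct = 0
    for text, label in examples:
        votes = {}
        for token in text.split():
            for lbl, n in model.get(token, {}).items():
                votes[lbl] = votes.get(lbl, 0) + n
        pred = max(votes, key=votes.get) if votes else None
        correct += (pred == label)
    return correct / max(len(examples), 1)

records = [
    ("great product", "pos"), ("awful service", "neg"),
    ("great support", "pos"), ("awful delay", "neg"),
    ("great", "pos"),
]
train_set, eval_set = prepare_data(records, split=0.8)
model = train(train_set)
print(evaluate(model, eval_set))
```

A real session would swap each stage for its library equivalent (tokenization, a training loop, a benchmark), but the prepare/train/evaluate shape stays the same.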
Build an LLM-powered application using LangChain.pdf (StephenAmell4)
LangChain is an advanced framework that allows developers to create language model-powered applications. It provides a set of tools, components, and interfaces that make building LLM-based applications easier. With LangChain, managing interactions with language models, chaining together various components, and integrating resources like APIs and databases is a breeze. The platform includes a set of APIs that can be integrated into applications, allowing developers to add language processing capabilities without having to start from scratch.
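The chaining idea at the heart of that description, a prompt template feeding a model call whose output feeds a parser, can be sketched without the framework itself. `FakeLLM` and `make_chain` below are illustrative stand-ins, not LangChain APIs.

```python
# Framework-agnostic sketch of the "chaining" pattern: template -> model -> parser.
# FakeLLM stands in for a real LLM client; a production chain would call an API here.

class FakeLLM:
    def __call__(self, prompt):
        # Pretend the model returns a summary of whatever it was asked.
        return f"Summary of: {prompt}"

def make_chain(template, llm, parse):
    """Compose template formatting, the model call, and output parsing into one step."""
    def step(**kwargs):
        return parse(llm(template.format(**kwargs)))
    return step

summarize = make_chain(
    template="Summarize the text: {text}",
    llm=FakeLLM(),
    parse=str.strip,
)
print(summarize(text="LangChain composes LLM calls."))
```

Because each chain step is just a callable, steps can be composed so one step's output becomes the next step's input, which is the pattern frameworks like LangChain package up with tooling around it.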
Langchain Framework is an innovative approach to linguistic data processing, combining the principles of language sciences, blockchain technology, and artificial intelligence. This deck introduces the groundbreaking elements of the framework, detailing how it enhances security, transparency, and decentralization in language data management. It discusses its applications in various fields, including machine learning, translation services, content creation, and more. The deck also highlights its key features, such as immutability, peer-to-peer networks, and linguistic asset ownership, that could revolutionize how we handle linguistic data in the digital age.
The document provides an overview of the threats and opportunities of generative AI for businesses, outlining practical steps for adopting generative AI technologies including understanding the impacts on industries and business models, discovering opportunities to improve productivity and monetize assets, and starting the adoption journey with prioritized use cases and pilots.
This document discusses a coin sharing structure for translation services using a blockchain. It proposes recording token transactions, translation data leases, and database contribution information on the blockchain. Contributors would receive points based on their database contribution, and profits would be regularly shared. A Mother of Language platform would provide ready-to-use translation data and confirm data through consensus among point holders. The translation data could also be leased to linguistic AI companies to share profits. The performance of AI translators could improve by learning from specialized translation data sets tagged with metadata like author, translator, and language pairs.
Bliss AdTech provides IT staff augmentation and outsourcing services. They offer flexible engagement models including time and materials basis, augmentation with SLAs, fixed price rotation basis, and statement of work basis. Their services help clients stay focused on their projects while Bliss handles resource hiring and management. Bliss has expertise in various in-demand technologies and can source the right talent for clients' needs.
ChatGPT is a highly advanced tool that allows non-technical users access to powerful capabilities and reduces the time required to develop applications. Consider that we want to build a to-do list application using React Native. Now with the help of ChatGPT, we will start the development process.
"Increase Productivity with ChatGPT" is a comprehensive presentation that explores the benefits of using ChatGPT to increase productivity in your working domain. From faster, more accurate responses to quicker task completion, discover how ChatGPT can help you reach new levels of productivity and success.
This document provides a 50-hour roadmap for building large language model (LLM) applications. It introduces key concepts like text-based and image-based generative AI models, encoder-decoder models, attention mechanisms, and transformers. It then covers topics like intro to image generation, generative AI applications, embeddings, attention mechanisms, transformers, vector databases, semantic search, prompt engineering, fine-tuning foundation models, orchestration frameworks, autonomous agents, bias and fairness, and recommended LLM application projects. The document recommends several hands-on exercises and lists upcoming bootcamp dates and locations for learning to build LLM applications.
This talk overviews my background as a female data scientist, introduces many types of generative AI, discusses potential use cases, highlights the need for representation in generative AI, and showcases a few tools that currently exist.
This document provides information about a bootcamp to build applications using Large Language Models (LLMs). The bootcamp consists of 11 modules covering topics such as introduction to generative AI, text analytics techniques, neural network models for natural language processing, transformer models, embedding retrieval, semantic search, prompt engineering, fine-tuning LLMs, orchestration frameworks, the LangChain application platform, and a final project to build a custom LLM application. The bootcamp will be held in various locations and dates between September 2023 and January 2024.
Vertex AI brings all of the components of a production machine learning project into one platform in the cloud, based on Google's Kubeflow. It executes ML jobs through pipelines, a set of connected Docker images that perform different functions in the process of training and executing a machine learning model. In this session you will learn how to develop and deploy components of pipelines.
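The pipeline idea described above, connected steps where each stage's output feeds the next, can be sketched in plain Python. Real Vertex AI pipeline components run as Docker images; the step functions and `run_pipeline` helper here are illustrative stand-ins, not the Vertex AI or Kubeflow API.

```python
# Conceptual sketch of an ML pipeline as a chain of steps, mirroring how
# pipeline components pass artifacts from one stage to the next.

def ingest():
    # Stand-in for a data-loading component.
    return [1.0, 2.0, 3.0, 4.0]

def preprocess(data):
    # Scale values into [0, 1].
    m = max(data)
    return [x / m for x in data]

def train_model(data):
    # Stand-in "model": the mean of the scaled data.
    return sum(data) / len(data)

def run_pipeline(steps):
    """Feed each step's output into the next, like edges in a pipeline DAG."""
    result = None
    for step in steps:
        result = step() if result is None else step(result)
    return result

model = run_pipeline([ingest, preprocess, train_model])
print(model)
```

On the platform, each of these functions would be a containerized component and the list would be a DAG definition, but the dataflow is the same.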
Conversational AI with Rasa - PyData Workshop (Tom Bocklisch)
Workshop building a simple chatbot with Rasa NLU and Core. Additional resources can be found in the repository https://github.com/tmbo/rasa-demo-pydata18/edit/master/README.md
Exploring Opportunities in the Generative AI Value Chain.pdf (Dung Hoang)
The article "Exploring Opportunities in the Generative AI Value Chain" by McKinsey & Company's QuantumBlack provides insights into the value created by generative artificial intelligence (AI) and its potential applications.
Use Case Patterns for LLM Applications (1).pdf (M Waleed Kadous)
What are the "use case patterns" for deploying LLMs into production? Understanding these will allow you to spot "LLM-shaped" problems in your own industry.
The document discusses various use cases for learning ChatGPT through prompts provided in the book "The art of Prompt Engineering with ChatGPT". These use cases include brainstorming ideas in a table, translating a poem from Marathi to English, summarizing content for children, writing articles and blogs, academic writing, drafting emails, learning to code with Python, finding recipes based on available ingredients, and noting important points about ChatGPT's capabilities and limitations. The document provides examples of prompts and ChatGPT's responses for each use case.
ChatGPT-4 can pass the American state bar exam, but before you go expecting to see robot lawyers taking over the courtroom, hold your horses cowboys – we're not quite there yet. That being said, AI is becoming increasingly human-like, and as a VC we need to start thinking about how this new wave of technology is going to affect the way we build and run businesses. What do we need to do differently? How can we make sure that our investment strategies are reflecting these changes? It's a brave new world out there, and we've got to keep the big picture in mind!
Sharing here with you what we at Cavalry Ventures found out during our Generative AI deep dive.
Details regarding the working of ChatGPT and basic use cases can be found in this presentation. The presentation also contains details regarding other OpenAI products and their usability. You can also find ways in which ChatGPT can be implemented in existing apps and websites.
Presenting the landscape of AI/ML in 2023: a quick summary of the last 10 years of progress, the current situation, and a look at what is happening behind the scenes.
LLMOps for Your Data: Best Practices to Ensure Safety, Quality, and Cost (Aggregage)
Join Shreya Rajpal, CEO of Guardrails AI, and Travis Addair, CTO of Predibase, in this exclusive webinar to learn all about leveraging the part of AI that constitutes your IP – your data – to build a defensible AI strategy for the future!
AI Basics, AI vs Machine Learning vs Deep Learning, AI Applications, Top 50 AI Game Changer Solutions, Advanced Analytics, Conversational Bots, Financial Services, Healthcare, Insurance, Manufacturing, Quality & Security, Retail, Social Impact, and Transportation & Logistics
Conversational AI and Chatbot Integrations (Cristina Vidu)
Conversational AI and Chatbots (or rather - and more extensively - Virtual Agents) offer great benefits, especially in combination with technologies like RPA or IDP. Corneliu Niculite (Presales Director - EMEA @DRUID AI) and Roman Tobler (CEO @Routinuum & UiPath MVP) are discussing Conversational AI and why Virtual Agents play a significant role in modern ways of working. Moreover, Corneliu will be displaying how to build a Workflow and showcase an Accounts Payable Use Case, integrating DRUID and UiPath Robots.
📙 Agenda:
The focus of our meetup is around the following areas - with a lot of room to discuss and share experiences:
- What is "Conversational AI" and why do we need Chatbots (Virtual Agents);
- Deep-Dive to a DRUID-UiPath Integration via an Accounts Payable Use Case;
- Discussion, Q&A
Speakers:
👨🏻💻 Corneliu Niculite, Presales Director - EMEA DRUID AI
👨🏼💻 Roman Tobler, UiPath MVP, Co-Founder & CEO Routinuum GmbH
This session streamed live on March 8, 2023, at 16:00 CET.
Check out our upcoming events at: community.uipath.com
Contact us at: community@uipath.com
Automate your Job and Business with ChatGPT #3 - Fundamentals of LLM/GPT (Anant Corporation)
This document provides an agenda for a full-day bootcamp on large language models (LLMs) like GPT-3. The bootcamp will cover fundamentals of machine learning and neural networks, the transformer architecture, how LLMs work, and popular LLMs beyond ChatGPT. The agenda includes sessions on LLM strategy and theory, design patterns for LLMs, no-code/code stacks for LLMs, and building a custom chatbot with an LLM and your own data.
Episode 2: The LLM / GPT / AI Prompt / Data Engineer Roadmap (Anant Corporation)
In this episode we'll discuss the different flavors of prompt engineering in the LLM/GPT space. According to your skill level you should be able to pick up at any of the following:
Leveling up with GPT
1: Use ChatGPT / GPT Powered Apps
2: Become a Prompt Engineer on ChatGPT/GPT
3: Use GPT API with NoCode Automation, App Builders
4: Create Workflows to Automate Tasks with NoCode
5: Use GPT API with Code, make your own APIs
6: Create Workflows to Automate Tasks with Code
7: Use GPT API with your Data / a Framework
8: Use GPT API with your Data / a Framework to Make your own APIs
9: Create Workflows to Automate Tasks with your Data /a Framework
10: Use Another LLM API other than GPT (Cohere, HuggingFace)
11: Use open source LLM models on your computer
12: Finetune / Build your own models
Series: Using AI / ChatGPT at Work - GPT Automation
Are you a small business owner or web developer interested in leveraging the power of GPT (Generative Pretrained Transformer) technology to enhance your business processes?
If so, join us for a series of events focused on using GPT in business. Whether you're a small business owner or a web developer, you'll learn how to leverage GPT to improve your workflow and provide better services to your customers.
Prompt engineering is the practice of designing and refining specific text prompts to guide transformer-based language models, such as Large Language Models (LLMs), in generating desired outputs. It involves crafting clear and specific instructions and allowing the model sufficient time to process information. By carefully engineering prompts, practitioners can harness the capabilities of LLMs to achieve different goals.
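The practice described above can be made concrete with a small prompt-building helper: explicit instructions, delimited input, and a requested output format. The template wording and the `<text>` delimiter choice are illustrative examples, not a prescribed standard.

```python
# Illustrative prompt-engineering pattern: clear role and task, delimited
# input so the model cannot confuse instructions with data, and an explicit
# output format.

def build_prompt(task, text, output_format):
    return (
        f"You are a careful assistant. {task}\n"
        f"Work step by step before answering.\n"
        f"Text (delimited by <text> tags):\n<text>{text}</text>\n"
        f"Respond only in this format: {output_format}"
    )

prompt = build_prompt(
    task="Classify the sentiment of the text.",
    text="The upgrade went smoothly and support was quick.",
    output_format="one word: positive, negative, or neutral",
)
print(prompt)
```

The same skeleton reuses across tasks by swapping the task description and output format, which is what makes prompt templates a practical engineering artifact rather than one-off strings.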
The document discusses recent trends in the IT industry including cloud computing, mobile applications, Agile methodology (Scrum), N-tier architecture, and Java vs .NET. It covers topics such as cloud computing fundamentals and features, social networking sites, mobile app development and monetization, the shift from waterfall to Agile development, Scrum processes, Agile principles, N-tier architecture, advantages of both Java and .NET, and the continued relevance of object-oriented design.
Building What's Next with Google Cloud's Powerful Infrastructure (MediaAgility)
Building What's Next with Google Cloud's Powerful Infrastructure. Companies are facing increasing challenges:
- Be more data-driven, but on-prem data is hard to access, analyze, and use
- Have to focus to stay ahead of the competition; can't afford wasted efforts
- Attract and retain customers and employees with great experiences
- Security threats keep growing
- Be more agile and turn IT into a competitive advantage
Google is focused on helping companies meet those challenges. To know more feel free to explore these slides and write back to us.
Lock in your calendars for October because GDSC GTBIT is about to drop the Google Cloud Jams bomb, and guess what? You're on the VIP list! 🌟🥳 Join us for an electrifying session hosted by tech virtuoso Mukal himself, upskill your learning journey using the Google Skill Boost Portal, and conquer those certifications!
Gen AI Cognizant & AWS event presentation_12 Oct.pdf (PhilipBasford)
This document provides an overview of generative AI capabilities and architectures on AWS. It discusses the evolution of generative AI and some of its potential uses, including generative search, smart data analytics assistance, text summarization, personalization, simulation, and automating routine tasks. It outlines several generative AI model families available on AWS, including Stable Diffusion, Claude, Jurassic-2, Titan, Command & Embed, and models available through Hugging Face. The document discusses Amazon SageMaker and Amazon Bedrock as flagship services for foundation models on AWS. It also presents the Enterprise Knowledge Navigator solution for advanced question answering, retrieval-augmented generation, security, and interacting with data lakes. The document concludes with two case studies.
This presentation covers the Orientation Ceremony PPT of Google Developer Student Clubs, organised online by the students of Noida Institute of Engineering and Technology.
This document discusses artificial intelligence and machine learning. It explains that deep learning is a key area of focus. It also outlines some fundamental components of an AI system including inputs, an inference engine, and outputs. Additionally, it discusses Google Cloud Platform's key AI components like TensorFlow, APIs for speech, video, vision, and translation. Finally, it provides examples of enterprise applications of AI in various industries.
201705 neoteric software development intro (Matt Kurleto)
Introduction to software development services provided by Neoteric. Java, JavaScript, AngularJS, JointJS, NodeJS, MongoDB, Neo4j developers. Get a quote for free.
This document introduces Google Cloud Platform (GCP) services. It begins with an overview of traditional computing versus cloud computing, noting key differences like paying for assets versus paying for use. The document then outlines various GCP computing, storage, database, machine learning, networking, identity/security, and developer tools. It provides examples of companies using App Engine, such as Khan Academy and Snapchat. In the final sections, it encourages attendees to get started with GCP and provides a link to free trials and training labs.
Applying lean, dev ops, and cloud for better business outcomes (Kartik Kanakasabesan)
1) Adopting DevOps, Lean, and cloud approaches can help government agencies deliver better citizen services with fewer resources by accelerating delivery of new features and getting faster feedback.
2) A DevOps approach involves applying Lean principles to get new ideas into production fast, get people to use new features, and get feedback in order to continuously improve. This allows agencies to change faster, which is an asset rather than an anchor.
3) Adopting cloud technologies helps remove bottlenecks around environment availability and provisioning, allowing standardized, lower cost, and faster delivery of applications and services.
This document discusses best practices for developing data science products at Philip Morris International (PMI). It covers:
- PMI's data science team of over 40 people across four hubs working on fraud prevention and other problems.
- Key principles for PMI's data science work, including being business-driven, investing in people, self-organizing, iterating to improve, and co-creating solutions.
- Challenges in data product development involving integrating work between data scientists and other teams, and practices like continuous integration/delivery to overcome these challenges.
- The role of data scientists in contributing code that is readable, testable, reusable, reproducible, and usable by other teams to integrate into
[Srijan Wednesday Webinars] How to Build a Cloud Native Platform for Enterpri... (Srijan Technologies)
Drupal has been a consistent leader in the Gartner Magic Quadrant for Web Content Management. However, enterprises leveraging Drupal have traditionally relied on PaaS providers for their hosting, scaling and lifecycle management. And that usually leads to enterprise applications being locked-in with a particular cloud or vendor.
As container and container orchestration technologies disrupt the cloud and platform landscape, there’s a clear way to avoid this state of affairs. In this webinar, we discuss why it's important to build a cloud-native Drupal platform, and exactly how to do that.
Join the webinar to understand how you can avoid vendor lock-in, and create a secure platform to manage, operate and scale your Drupal applications in a multi-cloud portable manner.
Key Takeaways:
- Why you need a cloud-native Drupal platform and how to build one
- How to craft an idiomatic development workflow
- Understanding infrastructure and cloud engineering - under the hood
- Demystifying the art and science of Docker and Kubernetes: deep dive into scaling the LAMP stack
- Exploring cost optimization and cloud governance
- Understand portability of applications
- A hands-on demo of how the platform works
How Google Cloud Platform can benefit DevOps? (VishnuAnji)
The VisualPath Best Google Cloud Platform Training In Hyderabad is designed to enhance your skills. We have experienced trainers. Get Real-time exposure to technology. If you want to become an expert in GCP then Register Now@9989971070.
This document provides an overview and introduction to Azure Notebooks and Jupyter notebooks. It discusses what Jupyter notebooks are, how they can be used for tasks like data analysis, and how Azure Notebooks builds on Jupyter by providing a cloud-based notebook environment. The document then demonstrates various features of notebooks like code execution, markdown, and data visualization using examples. It also discusses where notebooks fit best versus other tools and environments.
Applying DevOps, PaaS and cloud for better citizen service outcomes - IBM Fe...Sanjeev Sharma
1) Applying DevOps practices like continuous integration/delivery can help government agencies deploy IT projects faster and get citizen services into production quicker.
2) Using a Platform as a Service (PaaS) like IBM Bluemix allows agencies to build and manage applications faster while reducing costs and skills requirements.
3) Adopting a DevOps culture and tools that automate testing, deployment, and monitoring can help agencies accelerate delivery of citizen services with better outcomes and less resources.
This document provides an introduction to cloud engineering and site reliability engineering (SRE). It begins with a brief history of cloud computing and its emergence in the 2000s with services from Amazon, Google, Microsoft and others. It describes the roles of operators in ensuring stability and developers in pursuing agility. It defines key concepts like DevOps, SRE, and cloud computing. It discusses how DevOps breaks down silos between operators and developers. It also provides examples to illustrate why cloud infrastructure is important and offers resources for learning more.
This document discusses how data science models have transitioned to the cloud to take advantage of greater computing resources. It notes that data science models are resource-intensive and traditionally required powerful local machines. The cloud allows data scientists to run models on cloud infrastructure for lower costs than high-end laptops and with access to many GPUs. Several major cloud platforms - Azure, AWS, and Google Cloud - are discussed and compared in terms of their machine learning offerings. The document also introduces Microsoft's Team Data Science Process, which aims to help data science teams collaborate more effectively on projects in the cloud.
This document summarizes a Dagster community meeting at Rohde & Schwarz in May 2021. It introduces Simon as the presenter and discusses how Rohde & Schwarz uses Dagster for their SmartAnalytics product. Key points include:
- Rohde & Schwarz uses Dagster to move their ETL pipelines to the cloud for improved scalability.
- Their Dagster implementation includes event-driven pipelines using sensors to monitor file uploads and listen to an S3 folder.
- Advantages of Dagster include replacing their custom ETL engine, improved troubleshooting with its error reporting, and easier code reuse through resources and solids.
- Next steps include adding tests, improving documentation, establishing best practices, and
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Threats to mobile devices are increasingly prevalent and growing in scope and complexity. Users want to take full advantage of the features
available on their devices, but many of those features trade security for convenience and capability. This best practices guide outlines steps users can take to better protect personal devices and information.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
UiPath Test Automation using UiPath Test Suite series, part 6DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
The UiPath Test Automation with generative AI and OpenAI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI, a test automation solution, with OpenAI's advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of the CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Google cloud Study Jam 2023.pptx
3. Strap in, folks, because your host for today is here to take you to the world of the cloud. 🌥️🚀🌌
Narula Institute of Technology
RAJ SAHA
Google DSC’23 Lead
6. "Cloud computing: Where
the future takes shape
one virtual byte at a
time." ☁️💻
Narula Institute of Technology
Rajdip Bhattacharya
Google DSC’23 Cloud & DevOps Lead
7. So what is a server, after all?
“A computer program or device that provides a service to another
computer program and its user, also known as the client”
And why do we need to host it?
We host servers to make them publicly/privately available to other
servers/users. There are three ways of hosting servers:
● On premise
● Cloud
● Hybrid
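To make the definition concrete, here is a minimal sketch of a server and a client talking to it, using only Python's standard library (the greeting text and port choice are purely illustrative):

```python
# A server is just a program that answers requests from clients.
# Minimal sketch: a tiny HTTP server, plus one client request against it.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every GET with a fixed greeting.
        body = b"Hello from the server!"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Port 0 asks the OS for any free port.
server = HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" side: fetch the greeting back.
with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    reply = resp.read().decode()

server.shutdown()
print(reply)
```

Hosting, then, is just deciding where this program runs: on a machine you own (on premise), on a cloud provider's machine, or a mix of both (hybrid).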
8. Pros of on-premise
● Control over server hardware
● Security of data
● Low latency
9. But there are cons as well
● Capital expenditure: Companies must make a capital investment
to set up the server beforehand.
● Estimation problems: Hardware needs can be overestimated or
underestimated, leading to cost issues.
● Scalability: You cannot simply add or remove servers whenever
you want to.
● Increased management cost: Professional IT staff are required to
maintain and monitor the servers around the clock. If any component needs
replacement, the company has to bear the cost.
● Risk of data loss: If a server holding business data crashes, the
entire dataset could be lost.
11. Cloud computing is the on-demand
delivery of IT resources over the
Internet with pay-as-you-go pricing.
Instead of buying, owning, and
maintaining physical data centres and
servers, you can access technology
services, such as computing power,
storage, and databases, on an as-needed
basis from a cloud provider.
—Amazon Web Services (AWS)
12. Pros of cloud computing
● Cost savings: Cloud uses the pay-as-you-go model, meaning we pay for only
what we use. This shifts us from capital expenditure (CapEx) to operational
expenditure (OpEx).
● No maintenance cost: The cloud providers are responsible for maintaining the
servers, so we do not pay for maintenance.
● Scalability and flexibility: We can provision more cloud resources or remove
them as and when needed.
● Data loss prevention: Cloud providers provide us with data backup and
recovery options so that we never lose our data.
13. Timeline of cloud computing
● 1960: Introduction of computers
● 1972: Emergence of the first VM
● 1977: The term “cloud” was coined
● 1991: The World Wide Web was created
● 1999: Salesforce and VMWare came into existence
● 2002: Amazon Web Services (AWS) was launched
● 2008: Google App Engine was launched
● 2010: Microsoft Azure was launched
14. Why switch to cloud computing?
● Cost reduction, flexibility, and reliability.
● Absolute control over resources.
● Out of the box support for AI, ML and
analytics.
● Increase team’s performance and
strategies.
● Support for remote workforce.
Why should we switch?
15. Different flavours of cloud
When we think about using cloud computing, we would
want to know what best suits our purpose. These are
the three main models of cloud:
● Software as a service (SaaS)
● Platform as a service (PaaS)
● Infrastructure as a service (IaaS)
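A rough way to picture the three models is as a split of responsibilities between provider and customer. The mapping below is a simplified illustration, not an official definition:

```python
# Simplified responsibility split across the three cloud service models.
# The further down the list, the more the provider manages for you.
models = {
    "IaaS": {"provider": ["hardware", "virtualization"],
             "customer": ["os", "runtime", "app", "data"]},
    "PaaS": {"provider": ["hardware", "virtualization", "os", "runtime"],
             "customer": ["app", "data"]},
    "SaaS": {"provider": ["hardware", "virtualization", "os", "runtime", "app"],
             "customer": ["data"]},
}

for name, split in models.items():
    print(f"{name}: you manage {', '.join(split['customer'])}")
```

With SaaS you only bring your data; with IaaS you rent raw infrastructure and manage everything above it yourself.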
18. Pricing in cloud computing
Now that we are familiar with cloud computing, we should also understand a bit
about the pricing model. Any major cloud provider charges its customers in the
following ways:
● Free tier
● Pay-as-you-go
● Full upfront
● Partial upfront
● No upfront
While the free tier and pay-as-you-go might be a good choice for experimentation,
in the long run, reserving your hardware is recommended.
19. Product Categories
Whatever the cloud provider might be, they provide a particular set of
tools that can be categorized as follows:
1. Compute
2. Storage
3. Database
4. Containers
5. Serverless
6. Security and IAM
7. Machine learning
8. Analytics
9. Cloud SDK
20. Cloud certifications
Certifications are the best way to showcase your skills. They not only prove your
knowledge, but also make you stand out in the crowd. Certifications in cloud are of three
kinds:
● Foundation: Gets you started with the cloud platform
● Associate: Gets you comfortable working on the platform
● Professional: Gets you covered on security and best practices
Just a quick note: although certifications are a good thing to pursue and
showcase, they are by no means the way to judge someone. Remember, you don’t
judge a book by its cover :)
21. 😜 What do you call an AI that
tells jokes?
A "comedAIan"! 🎤🤣
Narula Institute of Technology
Sambit Chakraborty
Google DSC’23 AI-ML Lead
22. Do you use ChatGPT????
🤣🤣🤣
PS: FOR SOLVING CODING PROBLEMS
24. Timeline of Generative AI
● 1951: Noam Chomsky publishes Syntactic Structures
● 2013 & 2014: Word2Vec is developed; GloVe is developed; seq2seq-based GRU is introduced
● 2015: Generative adversarial networks (GANs) are developed; the term “Gen AI” is coined
● 2017: The Transformer model is developed; Google Translate is improved
● 2018: FastText is developed; OpenAI releases GPT-2
● 2020: OpenAI releases GPT-3; Google releases LaMDA
● 2021: Google AI releases BERT; Google’s Meena model is introduced
● From 2022 onwards
25. What is Generative AI?
Generative AI is a class of deep learning algorithms that
generate sequences of text, audio, images, and synthetic
data based on user input, known as a prompt.
These are backed by generative model algorithms such as
Naive Bayes, and by GANs.
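To make the "generate a sequence from a prompt" idea concrete, here is a toy character-level Markov model in pure Python. It is not deep learning, but the generate-one-symbol-at-a-time loop mirrors what generative models do; the corpus string is made up for the example:

```python
# Toy generative model: a character-level Markov chain that extends
# a prompt one symbol at a time, sampling from observed statistics.
import random
from collections import defaultdict

corpus = "the cloud delivers compute and storage over the internet on demand "

# Learn which characters tend to follow each character.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(prompt: str, length: int, seed: int = 0) -> str:
    """Extend the prompt by `length` sampled characters."""
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(length):
        out.append(rng.choice(follows[out[-1]]))
    return "".join(out)

print(generate("the ", 40))
```

Real generative models replace the character-frequency table with a deep network over tokens, but the sampling loop driven by a prompt is the same idea.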
26. Large Language Models (LLMs)
● LLMs are a special type of AI algorithm trained on truly massive datasets
of text.
● Trained using strongly supervised dataset annotation techniques.
● What kind of tasks can they perform?
- Generating text, translation, question answering, summarization, creative
content.
● Once your model is on its first version, just fine-tune it and you’re ready to solve
multiple problems.
Examples: Transformer, BERT, CTRL,
BLOOM, T5, RoBERTa, etc.
27. Foundation Models (FMs)
● FMs are an improved version of LLMs, trained on massive, complex data that
mixes text, code, images, video, and audio. The data scales over time.
● Trained using self-supervised and semi-supervised annotation techniques.
● What kind of tasks can they perform?
- More diverse, finer-grained tasks than traditional LLMs
● Little effort is needed to fine-tune the model
Examples: GPT, LaMDA, PaLM, Pathways System,
Gopher, LLaMA, SeamlessM4T, Falcon,
Midjourney, DALL-E, etc.
29. Prompt Engineering
It’s the process of designing and crafting prompts to guide large
language models (LLMs) to generate the desired outputs.
It is a crucial skill for anyone who wants to use LLMs effectively.
- It brings significant impact and improvement on the
quality of the output.
Tips:
1. Be clear and concise
2. Provide context; assign a role; vary sentence structures; state rules
3. Be specific and formal
4. Use examples and keywords
5. Avoid leading the model toward bias; avoid slang
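The tips above can be folded into a reusable prompt template. A small sketch follows; the field names and example values are hypothetical and not tied to any particular LLM API:

```python
# A reusable prompt template that bakes in the tips: role, context,
# an explicit task, rules, and an example of the desired output.
PROMPT_TEMPLATE = """You are a {role}.
Context: {context}
Task: {task}
Rules: {rules}
Example output: {example}
"""

def build_prompt(role: str, context: str, task: str, rules: str, example: str) -> str:
    """Fill the template so every prompt carries the same structure."""
    return PROMPT_TEMPLATE.format(
        role=role, context=context, task=task, rules=rules, example=example
    )

prompt = build_prompt(
    role="senior cloud engineer",
    context="A student club is hosting a Google Cloud Study Jam.",
    task="Write a 2-sentence invitation email.",
    rules="Formal tone, no slang, mention the date placeholder <DATE>.",
    example="Dear students, ...",
)
print(prompt)
```

Keeping the structure fixed and varying only the fields makes outputs more consistent and the prompt easier to iterate on.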
31. Stop using ChatGPT and Bard 🤮🤮
32. Enough is enough… no more definitions and… NO MORE EMOTIONAL DAMAGE.
Forget everything and let’s get into GOOGLE’S inventory of GEN AI.
33. Google is a leader in this diverse field.
From introducing the Transformer architecture, which made
today’s Gen AI movement possible for other orgs, to announcing
its most versatile language model, PaLM, which is actually more
powerful than GPT-4 when it scales.
34. DUET AI
Elevate your working experience in Google Workspace
Gmail:
● Smart Compose
● Smart Reply
Google Docs:
● Smart Compose
● Idea Brainstorming
● Smart Rewrite
● Summarizer
Google Sheets:
● Smart Organizer
Google Slides:
● Custom Visuals
Google Meet:
● Portrait Restore
● Portrait Light
● Automated Transcription*
● Meeting Summary*
Google Space:
● Chat Summary
*Will be released later this year
35. Vertex AI gets the Gen AI essence
Vertex AI is a unified platform on GCP for building and deploying machine learning models.
Announced at the Google I/O ’23 event, Vertex AI now has Generative AI support:
- PaLM 2
- New Foundation Models
- Codey
- Imagen
- Chirp
- Generative AI Studio
- Gen App Builder
- Model Garden
- Embeddings API, RLHF
36. PaLM 2
Stands for Pathways Language Model v2, developed by Google AI, successor of the original PaLM model.
Tasks it performs:
- Literary excellence
- Generate music
- Generate code & scripts
- Write emails and letters
- Question & answer
- Text summarization and information retrieval
- Creative content
USP: Seamless translation of 100+ languages, AR & VR
integration support, robust and failsafe scaling that
surpasses the accuracy of GPT-4
38. Codey
Primarily targeted at Google Colab integration. Based on the PaLM 2 model.
Tasks it performs:
- Generating code
- Analyzing and fixing errors
- Code completion
- Improving code quality with uniqueness
USP: Natural-language-driven, efficient support for 20+
programming languages, smart suggestions
Developed to rival the GPT-powered GitHub Copilot
39. Imagen
Text-to-image diffusion model by Google AI. Primarily targeted at businesses.
Tasks it performs:
- High-resolution image generation
- Product image creation
- AR & VR integration
- Consistent image generation with context
- Editing images with generative filling
USP: AR-VR integration, consistency
Developed to rival Midjourney, DALL-E, Firefly
40. Chirp
Speech-to-text model by Google AI.
Tasks it performs:
- Speech recognition & analysis
- Voice assistant
- Live captioning
- Live translation
- Transcription
- Text-to-speech generation
USP: 100+ language support, 98% accuracy in English with relative
improvements in other languages, simultaneous support for 10 million
distinct speakers
Developed to rival Whisper, Polly, SeamlessM4T
42. Project IDX
Project IDX is a new browser-based
development experience built on
Google Cloud and powered by Codey.
Scan this QR to explore project IDX
43. So long, farewell!
Just like AI, life is all
about continuous
learning and growth.
Take care! 🌱👋
Image generated by Imagen & DUET AI
47. Narula Institute of Technology
Sawan Bhattacharya
Google DSC’23 Tech Lead
Unleashing the power of
the cloud Study Jam, one
question at a time. ☁️🌟
Your Cloud Facilitator
here.