Azure Identity (AD, ADFS 2.0, AAD, ADB2C, OAuth, OpenID, PingID, AD Custom Policies),
Azure PaaS (Azure Functions, Serverless Computing, Azure Cosmos DB, Webhooks, API Apps, Logic Apps, Kudu, Azure Websites), Azure Functions, Lambda Functions, Event Functions, Serverless Architecture, Implementing Azure Functions on the GitHub comment feature, Why Azure Functions, Azure Virtual Machines, Azure Cloud Services, Azure Web Apps & WebJobs, Service Fabric, Consumption Plans, Billing Model, Benefits of Azure Functions, What is Serverless, Breaking bigger solutions into smaller Azure Functions, Microservices, Use Cases, Function App, Storing unstructured data into Cosmos DB using Azure Functions, Cosmos DB, Custom Azure Functions, Azure Cosmos DB, IoT, Document DB (Doc DB), How to set up a Jenkins build server and automatically trigger code from Visual Studio Online, Azure App Service, App Service Environment, Azure Stack, Managing Azure App Services, Azure PowerShell, Azure CLI, REST APIs, Azure Portal, Templates, Kudu console access, Running Git commands on the Kudu console, Locking Azure Resources, Configuring Custom Domains, Adding extensions to Azure Web Apps/Websites, App Service deployment options, Data Services in Azure, Azure SQL, Azure SQL Server, Azure SQL Database vs SQL Server in an Azure VM, SQL Tiers, DTU (Database Transaction Unit), Planning & provisioning Azure SQL databases, Migrating SQL databases, SQL Server transactional replication, Deploy Database to Microsoft Azure Database Wizard, DAC package, SQL compatibility issues, Migrating SQL with downtime, DMA (Data Migration Assistant), Database Snapshot, Migrating SQL without downtime, Recommendations for best performance during the SQL import process, Transactional Replication, T-SQL, Task to implement whatever you have learnt so far,
22. Azure Functions : GitHub webhook triggers a function whenever a comment is issued
Azure Functions : Store unstructured data using Azure Functions and Cosmos DB (Demo)
23. Get function URL: https://myfunctionappp.azurewebsites.net/api/Github-Webhook-JS?clientId=default
GitHub Secret : M4ZU1EmyWrZ/jimZkg*****************************ZJR9jg==
Azure Functions : Webhook + GITHub
https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-github-webhook-triggered-function
Learn how to create a function that is triggered by an HTTP webhook request with a GitHub-specific payload
24. Add the details in your own GitHub
Repo : https://github.com/TheAzureGuy007/The-Azure-Guy-Repo/settings/hooks/new
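As a sketch of what the webhook delivery looks like, you can simulate GitHub's POST locally before wiring up the real repository. The function URL below echoes the slide's earlier example; the secret and payload are placeholders (never reuse a real secret in a script), and GitHub's `X-Hub-Signature` header is the HMAC-SHA1 of the body keyed with that secret:

```shell
# Simulate a GitHub issue_comment webhook delivery.
# FUNC_URL is the slide's example endpoint; SECRET and PAYLOAD are placeholders.
FUNC_URL="https://myfunctionappp.azurewebsites.net/api/Github-Webhook-JS?clientId=default"
SECRET="placeholder-secret"
PAYLOAD='{"comment":{"body":"Great post!"}}'

# GitHub signs the request body with HMAC-SHA1 of the shared secret.
SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha1 -hmac "$SECRET" | awk '{print $NF}')
echo "X-Hub-Signature: sha1=$SIG"

# To actually deliver it (requires network access to the function app):
# curl -X POST "$FUNC_URL" \
#      -H "Content-Type: application/json" \
#      -H "X-GitHub-Event: issue_comment" \
#      -H "X-Hub-Signature: sha1=$SIG" \
#      -d "$PAYLOAD"
```

The Azure Functions GitHub webhook binding validates this signature against the function's configured secret, which is why the secret entered on the GitHub hooks page must match the one shown in the function's settings.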
27. Azure Cosmos DB
Globally distributed mission-critical applications
Guarantee access to users around the world with the high-availability and low-latency capabilities built into Microsoft’s global datacenters.
28. Azure Cosmos DB
IoT
Instantly, elastically scale to accommodate diverse and unpredictable IoT workloads without sacrificing ingestion or query performance.
37. Application Platform Offerings
There are three different methodologies available for running your applications on the Microsoft platform. Each gives a different level of control and autonomy, while still providing the flexibility, availability, and cost savings associated with PaaS workloads.
• Azure App Service: allows you to leverage the benefits of a PaaS solution
• App Service Environment: allows you to leverage Azure App Services in a more isolated and scalable environment
• Azure Stack: allows you to have Azure in your data center and use the same controls and architecture as Azure while still maintaining on-premises control. Because both are the same environment, one in your data center and one in the cloud, it can help you move more seamlessly into Azure if you wish to do so.
38. Managing Azure App Services
Management Tools
There are a variety of tools you can use to interact with and manage Azure App Services. Below are the main ones, available throughout Azure and also applicable to Azure App Services:
• Azure PowerShell: a set of modules providing PowerShell cmdlets. Can be run on Windows, Linux, or macOS
• Azure Command Line Interface (CLI): an open-source command-line shell. Can be run on Windows, Linux, or macOS
• REST APIs: REST-based APIs for Azure Resource Manager (ARM)
• Templates: Resource Manager templates, used to define resource objects and automate deployment and configuration
• Azure Portal:
 • New portal: portal.azure.com
 • Classic portal: manage.windowsazure.com
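As a minimal sketch of CLI-based management (the resource group and app names below are hypothetical; assumes the `az` CLI is installed and `az login` has been run), the following builds two typical management commands so they can be reviewed before executing:

```shell
# Sketch: managing a web app with the Azure CLI. Names are placeholders.
RG="demo-rg"
APP="demo-webapp"

# Build the commands first so they can be inspected or dry-run.
LIST_CMD="az webapp list --resource-group $RG --output table"
RESTART_CMD="az webapp restart --name $APP --resource-group $RG"

echo "$LIST_CMD"
echo "$RESTART_CMD"
# To actually run them: eval "$LIST_CMD" && eval "$RESTART_CMD"
```

The same operations are available through Azure PowerShell (`Get-AzWebApp`, `Restart-AzWebApp`) and the REST APIs; the CLI form is shown here because it runs identically on Windows, Linux, and macOS.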
40. Managing Azure App Services
You can run curl (file transfer commands) or Git commands on the Kudu console.
41. Locking Resources
Creating a Lock
In Azure App Services it is possible to lock a subscription, resource group, or resource such as your web application, to prevent other users from deleting or modifying it. You can set the lock level to:
• CanNotDelete: authorized users can still read and modify a resource, but they can't delete it.
• ReadOnly: authorized users can read a resource, but they can't delete it or perform any actions on it. Permission on the resource is restricted to the Reader role.
Locks differ from role-based access control in that locks apply to all users and roles.
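The lock levels above can be applied from the Azure CLI. A minimal sketch, with hypothetical names and assuming the `az` CLI against an existing resource group:

```shell
# Build an `az lock create` command for a CanNotDelete lock on a resource group.
# The names are hypothetical placeholders.
RG="demo-rg"
LOCK_NAME="no-delete-lock"
LOCK_CMD="az lock create --name $LOCK_NAME --resource-group $RG --lock-type CanNotDelete"
echo "$LOCK_CMD"
# Run it with: eval "$LOCK_CMD"
# A ReadOnly lock uses --lock-type ReadOnly instead.
```

Scoping the lock to a single resource instead of the whole group is done by adding `--resource` and `--resource-type` arguments to the same command.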
42. Configure Custom Domain Name
It is possible to use your own custom domain name with your Web App hosted on Azure App Services. You can purchase a domain name directly through the Azure App Service portal, or you can bring your own.
43. Adding Site Extensions
It's possible to add extensions to your Web App. Extensions can add functionality, ease management overhead, and help in a number of different ways depending on your requirements. There are many extensions available; a few that extend functionality and improve monitoring capabilities are called out here:
• Application Insights: provides monitoring capabilities
• New Relic: provides monitoring capabilities; more details on the project page
• PHP Manager: a tool for managing PHP installations; more details are available on the PHP Manager GitHub page
• Jekyll: adds support for Jekyll on a Web App; more details are available on the Jekyll Site Extension GitHub page
44. App Service Deployment Options
There are a number of different options available for deployment:
Basic:
• FTP
• Web Deploy
Alternative:
• OneDrive/Dropbox
• Kudu
Source Control / Continuous Deployment:
• Visual Studio Online
• Local Git
• GitHub
• Bitbucket
You can use various tools as part of these, such as PowerShell and the Azure CLI.
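The Local Git option above can be sketched as follows. The remote URL is a hypothetical placeholder; a real App Service Git URL comes from the portal or from `az webapp deployment source config-local-git`:

```shell
# Deploy to App Service via Local Git by pushing to a Git remote.
# The remote URL below is a hypothetical placeholder.
mkdir -p demo-app && cd demo-app
git init -q
echo "<h1>hello</h1>" > index.html
git add index.html
git -c user.email=you@example.com -c user.name=you commit -qm "initial"
git remote add azure "https://demo-webapp.scm.azurewebsites.net/demo-webapp.git"
git remote -v   # shows the azure remote
# Deploy with: git push azure master  (authenticating with the app's deployment credentials)
```

On push, the Kudu engine on the `*.scm.azurewebsites.net` endpoint builds and deploys the content, which is the same mechanism the GitHub and Bitbucket continuous-deployment options use.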
45. Data Services in Azure Overview
There are a number of different options to meet your needs for data services in Azure:
• SQL Database: based on SQL Server; provides a relational database service with scale, performance, and availability, as well as integration with existing on-premises SQL Server workloads for hybrid implementations.
• SQL Data Warehouse: a combination of the SQL Server relational database with Azure cloud scale-out capabilities. Suitable for enterprise, large-scale workloads.
• Document DB: a schema-free NoSQL database service, highly scalable and available.
• Table Storage: stores structured NoSQL data with no schema. A lower-cost option than SQL. Could be suitable for user data for web apps, address books, device information, etc.
• Redis Cache: provides access to a Redis cache, accessible by any application within Azure, providing high throughput and low latency for applications requiring speed and scale.
• Data Factory: manages movement and integration of data. Assists in integrating different sources and different types of data.
• Data Lake: a collection of services that allows for storing, managing, and analyzing large amounts of data, getting the most out of the data you have.
50. Service Tiers
There are three different service tiers to accommodate various workload requirements. All provide an uptime SLA of 99.99% and hourly billing.
The service tiers are:
• Basic: suitable for small databases and low-volume needs
• Standard: suitable for most cloud-based apps
• Premium: suitable for high transactional volumes and mission-critical workloads
Within each of these top-level tiers, there are various performance levels available. It is possible to change service tiers and performance levels dynamically.
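Choosing a tier and performance level can also be scripted. A sketch with hypothetical server and database names (assumes the `az` CLI; `--service-objective S0` selects a Standard-tier performance level, and re-running `update` with a different objective is the dynamic tier change described above):

```shell
# Create an Azure SQL database at Standard S0, then scale it to S2.
# All names are hypothetical placeholders.
RG="demo-rg"; SERVER="demo-sqlserver"; DB="demo-db"
CREATE_CMD="az sql db create --resource-group $RG --server $SERVER --name $DB --service-objective S0"
SCALE_CMD="az sql db update --resource-group $RG --server $SERVER --name $DB --service-objective S2"
echo "$CREATE_CMD"
echo "$SCALE_CMD"
# Run with: eval "$CREATE_CMD"   (and later)   eval "$SCALE_CMD"
```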
51. What is a DTU?
A Database Transaction Unit (DTU) is a measure of the resources that are guaranteed to be available to a standalone Azure SQL database at a specific performance level within a service tier.
• It is a measure that combines CPU, memory, and I/O values.
• The larger the number, the better the performance. This unit of measure gives you a way to see what your overall performance levels and needs are, and then relate that to cost.
An elastic DTU (eDTU) is a measure of the resources shared across a set of databases, called an elastic pool.
54. Migrating a SQL Database
The general steps you should follow are:
• Test for compatibility: validate the database compatibility
• Fix compatibility issues if found
• Perform the migration
There are a number of options available to help with the migration process, depending on whether or not you can afford some downtime.
If you need minimal downtime, you can use SQL Server transactional replication to replicate your data.
If you can accept some downtime, the options include:
• Use the built-in Deploy Database to Microsoft Azure Database Wizard
• Export to a DAC package and import the DAC package in Azure SQL
• If you just want the schema, you can generate a script for the entire database schema using Transact-SQL
55. Method 1: Migration with downtime
1. Assess the database for any compatibility issues using the latest version of the Data Migration Assistant (DMA).
2. Prepare any necessary fixes as Transact-SQL scripts.
3. Make a transactionally consistent copy of the source database being migrated, and ensure no further changes are being made to the source database (or manually apply any such changes after the migration completes). There are many methods to quiesce a database, from disabling client connectivity to creating a database snapshot.
4. Deploy the Transact-SQL scripts to apply the fixes to the database copy.
5. Export the database copy to a .BACPAC file on a local drive.
6. Import the .BACPAC file as a new Azure SQL database using any of several BACPAC import tools, with SqlPackage.exe being the recommended tool for best performance.
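Steps 5 and 6 can be sketched with SqlPackage. The server, database, and credential values below are hypothetical placeholders, not values from this course:

```shell
# Export the prepared copy to a .BACPAC, then import it into Azure SQL.
# All connection values are hypothetical placeholders.
EXPORT_CMD="sqlpackage /Action:Export /SourceServerName:localhost /SourceDatabaseName:MyDbCopy /TargetFile:MyDb.bacpac"
IMPORT_CMD="sqlpackage /Action:Import /SourceFile:MyDb.bacpac /TargetServerName:demo-sqlserver.database.windows.net /TargetDatabaseName:MyDb /TargetUser:demo-admin /TargetPassword:ChangeMe-placeholder"
echo "$EXPORT_CMD"
echo "$IMPORT_CMD"
# Run each with: eval "$EXPORT_CMD"  /  eval "$IMPORT_CMD"
```

In practice the password should come from an environment variable or a secret store rather than being embedded in the command line.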
56. Optimizing data transfer performance during migration
The following list contains recommendations for best performance during the import process:
• Choose the highest service level and performance tier that your budget allows to maximize the transfer performance. You can scale down after the migration completes to save money.
• Minimize the distance between your .BACPAC file and the destination data center.
• Disable auto-statistics during migration.
• Partition tables and indexes.
• Drop indexed views, and recreate them once finished.
• Remove rarely queried historical data to another database and migrate it to a separate Azure SQL database. You can then query this historical data using elastic queries.
57. Method 2: Use Transactional Replication
1. Set up Distribution
 • Using SQL Server Management Studio (SSMS)
 • Using Transact-SQL
2. Create Publication
 • Using SQL Server Management Studio (SSMS)
 • Using Transact-SQL
3. Create Subscription
 • Using SQL Server Management Studio (SSMS)
 • Using Transact-SQL
61. Data Lake
What is Data Lake?
Data Lake is a batch, real-time, and interactive data analysis tool. Data Lake makes it easy for developers, data scientists, and analysts to store data of any size, shape, and speed, and to do all types of processing and analytics across platforms and languages.
Azure Data Lake is a family of Azure services that enables you to analyze your big data workloads in a managed manner.
It consists of these services:
• Azure Data Lake Store: a data repository that enables you to store any type of data in its raw format without defining a schema. The store offers unlimited storage with immediate read/write access and lets you scale the throughput you need for your workloads. The store is Hadoop Distributed File System (HDFS) compatible, so you can use your existing tools.
• Azure Data Lake Analytics: an analytics service that allows you to run analysis jobs on data. Analytics uses Apache YARN to manage resources for its processing engine. Using the U-SQL query language, you can process data from several data sources such as Azure Data Lake Store, Azure Blob Storage, and Azure SQL Database, but also from other data stores built on HDFS.
• Azure HDInsight: an analytics service that enables you to analyze data sets on a managed cluster running open-source technologies such as Hadoop, Spark, Storm, and HBase.
91. Running Docker Machine Commands in QuickStart Terminal
bash --login '/Applications/Docker/Docker Quickstart Terminal.app/Contents/Resources/Scripts/start.sh'
Last login: Fri Sep 29 22:12:55 on ttys000
Girishs-Mac:~ girishkalamati$ bash --login '/Applications/Docker/Docker Quickstart Terminal.app/Contents/Resources/Scripts/start.sh'
(Docker whale ASCII-art banner)
docker is configured to use the default machine with IP 192.168.99.100
For help getting started, check out the docs at https://docs.docker.com
Girishs-Mac:~ girishkalamati$ docker-machine ls
92. Running Docker Machine Commands in Normal Bash
Girishs-Mac:~ girishkalamati$ docker ps
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
Girishs-Mac:~ girishkalamati$ docker-machine env default
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.99.100:2376"
export DOCKER_CERT_PATH="/Users/girishkalamati/.docker/machine/machines/default"
export DOCKER_MACHINE_NAME="default"
# Run this command to configure your shell:
# eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ docker ps
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS   NAMES
Girishs-Mac:~ girishkalamati$
95. Running Docker Client Commands
Girishs-Mac:~ girishkalamati$ docker pull hello-world
Using default tag: latest
latest: Pulling from library/hello-world
Digest: sha256:b2ba691d8aac9e5ac3644c0788e3d3823f9e97f757f01d2ddc6eb5458df9d801
Status: Image is up to date for hello-world:latest
Girishs-Mac:~ girishkalamati$ docker images
REPOSITORY    TAG      IMAGE ID       CREATED       SIZE
hello-world   latest   05a3bd381fc2   2 weeks ago   1.84kB
Girishs-Mac:~ girishkalamati$ docker run hello-world
Hello from Docker!
This message shows that your installation appears to be working correctly.
98. docker is configured to use the default machine with IP 192.168.99.100
For help getting started, check out the docs at https://docs.docker.com
Girishs-Mac:~ girishkalamati$ docker ps -a
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS   NAMES
Girishs-Mac:~ girishkalamati$ docker images
REPOSITORY    TAG      IMAGE ID       CREATED       SIZE
hello-world   latest   05a3bd381fc2   2 weeks ago   1.84kB
Girishs-Mac:~ girishkalamati$ docker rmi 05a3
Untagged: hello-world:latest
Untagged: hello-world@sha256:b2ba691d8aac9e5ac3644c0788e3d3823f9e97f757f01d2ddc6eb5458df9d801
Deleted: sha256:05a3bd381fc2470695a35f230afefd7bf978b566253199c4ae5cc96fafa29b37
Deleted:
100. Running Docker Client Commands
Girishs-Mac:~ girishkalamati$ docker ps
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
Girishs-Mac:~ girishkalamati$ docker-machine env default
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.99.100:2376"
export DOCKER_CERT_PATH="/Users/girishkalamati/.docker/machine/machines/default"
export DOCKER_MACHINE_NAME="default"
# Run this command to configure your shell:
# eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ docker ps
CONTAINER ID   IMAGE                         COMMAND          CREATED          STATUS          PORTS                NAMES
0f8a3ec0d161   kitematic/hello-world-nginx   "sh /start.sh"   16 minutes ago   Up 16 minutes   0.0.0.0:80->80/tcp   trusting_swanson
121. PS C:\Users\Girish> docker ps
error during connect: Get http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.31/containers/json: open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
PS C:\Users\Girish> docker-machine env default
$Env:DOCKER_TLS_VERIFY = "1"
$Env:DOCKER_HOST = "tcp://192.168.99.100:2376"
$Env:DOCKER_CERT_PATH = "C:\Users\Girish\.docker\machine\machines\default"
$Env:DOCKER_MACHINE_NAME = "default"
$Env:COMPOSE_CONVERT_WINDOWS_PATHS = "true"
# Run this command to configure your shell:
# & "C:\Program Files\Docker Toolbox\docker-machine.exe" env default | Invoke-Expression
PS C:\Users\Girish> & "C:\Program Files\Docker Toolbox\docker-machine.exe" env default | Invoke-Expression
PS C:\Users\Girish> docker ps
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS
157. Add Build Step in Build Workflow
You need to add two Docker steps for each image: one to build the image, and one to push the image to the Azure Container Registry.
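The two steps above can be sketched as shell commands. The registry and image names are hypothetical placeholders, and the sketch assumes Docker is installed and an Azure Container Registry has already been provisioned:

```shell
# Build an image, tag it for an Azure Container Registry, and push it.
# Registry and image names are hypothetical placeholders.
ACR="demoregistry.azurecr.io"
IMAGE="demo-api"
TAG="1.0"
BUILD_CMD="docker build -t $ACR/$IMAGE:$TAG ."
PUSH_CMD="docker push $ACR/$IMAGE:$TAG"
echo "$BUILD_CMD"
echo "$PUSH_CMD"
# Authenticate first with: az acr login --name demoregistry
```

A CI build workflow simply runs these same two commands as its Docker build step and Docker push step, once per image.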
158. Bamboo (Continuous Delivery Tool): Build
Focus on coding and count on Bamboo as your CI and build server! Create multi-stage build plans, set up triggers to start builds upon commits, and assign agents to your critical builds and deployments.
Test
Testing is a key part of continuous integration. Run automated tests in Bamboo to regress your products thoroughly with each change. Parallel automated tests unleash the power of agile development and make catching bugs easier and faster.
159. Bamboo (Continuous Delivery Tool): Deploy
Deployment projects automate the tedium right out of releasing into each environment, while letting you control the flow with per-environment permissions.
Connect
Bamboo boasts the best integration with JIRA Software, Bitbucket, Fisheye, and HipChat. Also, boost your CI pipeline by choosing from more than a hundred and fifty add-ons in our Marketplace, or make your own.
161. JIRA (Development Tool)
Plan
Create user stories and issues, plan sprints, and distribute tasks across your software team.
Track
Prioritize and discuss your team's work in full context with complete visibility.
162. JIRA (Development Tool)
Release
Ship with confidence and sanity knowing the information you have is always current.
Report
Improve team performance based on real-time, visual data you can use.
Editor's Notes
Running Docker commands in QuickStart Terminal
Running Docker commands in Bash Terminal
Running Docker commands in Bash Terminal
docker run -p 8080:3000 -v /var/www node
node == image name
It will create a container inside the host … even though it says container volume