The health checks API is one of the new features in ASP.NET Core 2.2 for application health monitoring. An ASP.NET Core 2.2 application exposes health checks as HTTP endpoints, which enables liveness and readiness probes.
Health checks are usually used with an external monitoring service or container orchestrator to check the status of an app. In this article, I am going to share the steps needed to configure Kubernetes liveness and readiness probes for an ASP.NET Core 2.2 web application deployed to an Azure Kubernetes Service (AKS) cluster.
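On the Kubernetes side, the probes can be sketched as the deployment fragment below. The endpoint paths (/health/live, /health/ready), port and timing values are assumptions for illustration; they must match whatever paths the app maps for its health check endpoints.

```yaml
# Container spec fragment (paths, port and timings are hypothetical)
livenessProbe:
  httpGet:
    path: /health/live     # endpoint mapped by the ASP.NET Core health checks middleware
    port: 80
  initialDelaySeconds: 10
  periodSeconds: 15
readinessProbe:
  httpGet:
    path: /health/ready    # readiness endpoint, e.g. also checks backing services
    port: 80
  initialDelaySeconds: 5
  periodSeconds: 10
```

Kubernetes restarts the container when the liveness probe fails, and stops routing traffic to the pod while the readiness probe fails.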
The dev tools used to develop these components are Visual Studio for Mac/Visual Studio 2017 and Visual Studio Code. The AKS Dashboard and kubectl commands are used to create Kubernetes resources in AKS. The complete source code for this application can be downloaded from GitHub.
When developing and troubleshooting components deployed to a Kubernetes cluster, pod logs can be viewed by running the command kubectl logs {POD_NAME}
Adding the -f argument streams the log output, and the command is kubectl logs {POD_NAME} -f
The --previous argument shows the logs of a previous instantiation of the container, and the command is kubectl logs {POD_NAME} --previous
The arguments listed above are also valid when viewing logs for a specific container within a pod; you will need to specify the container name with the -c argument, and the command is kubectl logs {POD_NAME} -c {CONTAINER_NAME}
Hope this tip helps you in troubleshooting components deployed to a Kubernetes cluster.
This is the next part of the series on developing and deploying
Angular, ASP.NET Core Web API and SQL Server to Azure Kubernetes Service
Function Apps using Azure Functions 2.0 runtime
In this article, I am going to share the steps needed to deploy SonarQube to an Azure Kubernetes Service cluster and integrate it with an Azure DevOps pipeline to set up code analysis for the Angular and ASP.NET Core web apps created in previous parts of this series. The previous articles of this series are
The tools used to develop these components are Visual Studio for Mac/VS Code/VS 2017, AKS Dashboard, Docker for Desktop and kubectl.
SonarQube
SonarQube provides the capability not only to show the health of an application but also to highlight newly introduced issues. I am going to configure SQL Server as the backend database for SonarQube.
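Pointing the SonarQube container at SQL Server can be sketched as the deployment fragment below. The service name, database name, image tag and secret keys are assumptions for illustration; the SONARQUBE_JDBC_* environment variables are the ones the SonarQube Docker image reads its connection settings from.

```yaml
# Deployment fragment (service name, database and secret keys are hypothetical)
containers:
- name: sonarqube
  image: sonarqube
  env:
  - name: SONARQUBE_JDBC_URL
    value: "jdbc:sqlserver://mssql-service:1433;databaseName=sonar"
  - name: SONARQUBE_JDBC_USERNAME
    valueFrom:
      secretKeyRef:
        name: mssql-secret
        key: username
  - name: SONARQUBE_JDBC_PASSWORD
    valueFrom:
      secretKeyRef:
        name: mssql-secret
        key: password
```

Keeping the credentials in a Secret rather than inline in the deployment keeps the manifest safe to commit to source control.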
In this article I am going to go through the steps needed to add real-time web functionality to the Angular app using ASP.NET Core SignalR and the Azure SignalR Service bindings for Azure Functions 2.0. The specific topics this article covers are
Add ASP.NET Core SignalR to ASP.NET Core 2.1 Web API
ASP.NET Core SignalR
ASP.NET Core SignalR scale out using
Azure SignalR Service backplane
Redis Cache backplane
Publish/Subscribe messages to SignalR Hub from Angular App
Publish/Subscribe messages to SignalR Hub using Azure SignalR Service bindings for Azure Functions 2.0 from Angular App
Build Docker images and deploy to Azure Kubernetes Service
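The publish/subscribe flow between the Angular app and the hub can be sketched conceptually. The stand-alone TypeScript sketch below models a hub as a set of client callbacks per method name; the names (Hub, subscribe, publish) are illustrative stand-ins, not the actual SignalR client API.

```typescript
// Conceptual sketch of a SignalR-style hub: clients subscribe handlers to
// named methods, and a publish fans the payload out to every subscriber.
type Handler = (payload: string) => void;

class Hub {
  private subscribers = new Map<string, Handler[]>();

  // Conceptually like connection.on("method", handler) on the client side.
  subscribe(method: string, handler: Handler): void {
    const list = this.subscribers.get(method) ?? [];
    list.push(handler);
    this.subscribers.set(method, list);
  }

  // Conceptually like the server broadcasting to all connected clients.
  publish(method: string, payload: string): number {
    const list = this.subscribers.get(method) ?? [];
    for (const handler of list) handler(payload);
    return list.length; // number of clients notified
  }
}

// Usage: two "browser clients" subscribe, one published message reaches both.
const hub = new Hub();
const received: string[] = [];
hub.subscribe("newMessage", (m) => received.push(`client1:${m}`));
hub.subscribe("newMessage", (m) => received.push(`client2:${m}`));
hub.publish("newMessage", "hello");
```

A backplane (Azure SignalR Service or Redis, as covered in the topics above) extends this fan-out across multiple server instances, so a message published on one instance reaches clients connected to another.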
This article is the second part of the series on deploying Angular, ASP.NET Core and SQL Server on Linux to an Azure Kubernetes Service (AKS) cluster. The first part describes the steps needed to deploy these components to AKS. App configuration in ASP.NET Core is based on key-value pairs established by configuration providers, which read configuration data into key-value pairs from a variety of configuration sources. In this article I am going to share multiple ways to load app configuration in an ASP.NET Core Web API
Hosting Environment specific appsettings.json
Dockerfile Environment Variables
Kubernetes
Container Environment variables with data from ConfigMap/Secret
Populate Volume (Config file) with data stored in a ConfigMap/Secret
Azure Key Vault Secrets
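The key-value layering behind all of these sources can be sketched conceptually: each provider contributes a flat key-value map, and providers registered later override earlier ones (for example, environment variables over appsettings.json). The stand-alone TypeScript sketch below illustrates the precedence rule only; it is not the actual ConfigurationBuilder API, and the keys and values are hypothetical.

```typescript
// Each provider yields a flat map of keys (colon-separated, as in
// ASP.NET Core configuration) to values; later providers win on conflicts.
type ConfigSource = Record<string, string>;

function buildConfiguration(...providers: ConfigSource[]): Map<string, string> {
  const config = new Map<string, string>();
  for (const provider of providers) {
    for (const [key, value] of Object.entries(provider)) {
      config.set(key, value); // a later provider overrides an earlier one
    }
  }
  return config;
}

// Usage: the environment variable overrides the JSON file's connection string,
// while keys the later provider does not set keep their earlier values.
const appsettingsJson: ConfigSource = {
  "ConnectionStrings:Default": "Server=localhost",
  "Logging:Level": "Warning",
};
const environmentVariables: ConfigSource = {
  "ConnectionStrings:Default": "Server=mssql-service",
};
const config = buildConfiguration(appsettingsJson, environmentVariables);
// config.get("ConnectionStrings:Default") → "Server=mssql-service"
// config.get("Logging:Level")             → "Warning"
```

This is why the same key can safely appear in appsettings.json, a ConfigMap-backed environment variable and Azure Key Vault: whichever source is registered last supplies the effective value.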
The tools used to develop these components are Visual Studio for Mac/VS Code/VS 2017, AKS Dashboard, Docker for Desktop and kubectl. The formatting of code snippets in this article may get distorted (especially YAML), so please refer to the GitHub repository for the complete source code for this article.
This is the second part of the series on deploying Elasticsearch, Logstash and Kibana (ELK) to an Azure Kubernetes Service cluster. In this article I am going to share the steps needed to enable Azure AD SAML-based single sign-on to secure Elasticsearch and Kibana hosted in AKS. I will also go through the steps needed to secure communications in the ELK cluster. The first part describes the steps needed to deploy ELK to AKS and consume messages from Azure Event Hub
Using SAML SSO for Elasticsearch with AAD means that Elasticsearch does not need to be seeded with any user accounts from the directory. Instead, Elasticsearch is able to rely on the claims sent within a SAML token in response to successful authentication to determine identity and privileges. I have referred to this article to enable SAML-based single sign-on for Elasticsearch.
Kibana, as the user-facing component, interacts with the user's browser and receives all the SAML messages that Azure AD sends to the Elastic Stack Service Provider. Elasticsearch implements most of the functionality a SAML Service Provider needs. It holds all SAML-related configuration in the form of an authentication realm, and it generates all required SAML messages and passes them to Kibana to be relayed to the user's browser. Finally, it consumes all SAML responses that Kibana relays to it, verifies them, extracts the necessary authentication information and creates the internal authentication tokens based on it. The component diagram has been updated to add Azure AD SAML-based SSO integration.
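The authentication realm described above is configured in elasticsearch.yml and looks roughly like the fragment below. The realm name, tenant placeholder and Kibana host are illustrative placeholders for the values produced when registering the application in Azure AD, so treat this as a sketch rather than a drop-in configuration.

```yaml
# elasticsearch.yml fragment (realm name, tenant and hostnames are placeholders)
xpack.security.authc.realms.saml_aad:
  type: saml
  order: 2
  idp.metadata.path: "https://login.microsoftonline.com/<tenant-id>/federationmetadata/2007-06/federationmetadata.xml"
  idp.entity_id: "https://sts.windows.net/<tenant-id>/"
  sp.entity_id: "https://kibana.example.com/"
  sp.acs: "https://kibana.example.com/api/security/v1/saml"
  sp.logout: "https://kibana.example.com/logout"
  attributes.principal: "nameid"
```

Elasticsearch reads the identity provider's signing certificates and endpoints from the federation metadata document, while the sp.* settings tell Azure AD where to send SAML responses for the Kibana instance acting as the user-facing endpoint.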
The dev tools used to develop these components are Visual Studio for Mac/VS Code, AKS Dashboard, kubectl, bash and openssl. The code snippets in this article are mostly YAML and are included for reference only, as formatting may get distorted; please refer to the GitHub repository for the formatted resources.
Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure. You can read about Azure Functions 2.0 general availability @ Introducing Azure Functions 2.0. Runtime 2.0 runs on .NET Core 2, which means it can run on all platforms supported by .NET Core, including macOS and Linux. This enables cross-platform development and hosting scenarios.
In this article I am going to
Create Azure Functions triggered by Azure Blob storage and Event hub in Visual Studio Code
Debug locally in Visual Studio Code
Deploy Azure Functions to Azure Kubernetes Service
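Packaging the functions for AKS can be sketched with a Dockerfile built on the official Azure Functions 2.0 runtime base image. The project layout and output paths below are assumptions for illustration.

```dockerfile
# Build the functions project with the .NET Core SDK (paths are hypothetical)
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

# Run it on the Azure Functions 2.0 runtime image
FROM mcr.microsoft.com/azure-functions/dotnet:2.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot
COPY --from=build /app /home/site/wwwroot
```

The resulting image can then be pushed to a registry and deployed to the AKS cluster like any other container; the storage and Event Hub connection strings are supplied to the container as environment variables (for example, from a Kubernetes Secret).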
The dev tools used to develop these components are Visual Studio Code for macOS, Docker, AKS Dashboard and kubectl.