Over the weekend I was working on a big data platform, and during the POC I found that library versions matter a great deal: if you use mismatched versions, things will not work as expected.
This blog will help you integrate Apache Spark with Azure Blob Storage as a data lake.
For this activity we need the following:
- Azure Account(Blob Storage)
- Linux Instance
Let’s start with Azure Blob Storage. For that, you need an Azure account, in which you create a storage account as shown below.
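Once the storage account exists, Spark can read from it directly over the `wasbs://` scheme. A minimal sketch is below; the storage account name `mystorageacct`, the container name `datalake`, and the connector versions are assumptions for illustration — the hadoop-azure connector version must match your Spark/Hadoop build.

```shell
# Launch spark-shell with the Azure Blob connector on the classpath.
# "mystorageacct" and "datalake" are placeholder names; supply your own
# storage account access key in place of <ACCESS_KEY>.
spark-shell \
  --packages org.apache.hadoop:hadoop-azure:3.3.1,com.microsoft.azure:azure-storage:8.6.6 \
  --conf spark.hadoop.fs.azure.account.key.mystorageacct.blob.core.windows.net=<ACCESS_KEY>

# Inside the shell, blobs are addressed with the wasbs:// scheme, e.g.:
#   spark.read.csv("wasbs://datalake@mystorageacct.blob.core.windows.net/data.csv")
```

Keeping the access key out of the command line (for example, in `core-site.xml` or an environment variable) is safer for anything beyond a quick POC.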
This blog will help you integrate Jenkins with ECR in an efficient and secure way. For that, we need an AWS account, ECR (Elastic Container Registry), Jenkins, and an IAM role.
The following are ways to push and pull a Docker image from ECR in Jenkins.
When using the AWS CLI or an AWS role, we run the command below to log in to the ECR repository.
aws ecr get-login
This generates a token using the AWS role or credentials, which is valid for 12 hours. …
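In practice the token is passed straight to `docker login`. A sketch of both the legacy flow and its AWS CLI v2 replacement is below; the region and account ID are placeholders.

```shell
# Legacy (AWS CLI v1): get-login prints a complete "docker login" command,
# which is commonly evaluated in place:
$(aws ecr get-login --no-include-email --region us-east-1)

# AWS CLI v2 removed get-login in favour of get-login-password,
# which pipes the 12-hour token to docker login on stdin:
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
```

The v2 form is preferable in Jenkins because the token never appears in the process list or build log.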
As technology advances by the hour, troubleshooting becomes ever more difficult. When we run microservices in Kubernetes, it becomes even harder to find the root cause of failures. Today we discuss how to easily troubleshoot issues in Kubernetes.
Suppose we have deployed an application in Kubernetes and, over a specific time span, we are getting 502 error codes, or some requests fail when we do deployments or restart Pods.
To start troubleshooting, we have to identify a point from which we…
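A common first pass for this kind of symptom is to check pod health, events, and service endpoints. The sketch below assumes a hypothetical application `myapp` in a namespace `prod`; substitute your own names.

```shell
# Look for restarts and non-Ready pods
kubectl get pods -n prod

# Events on a suspect pod: failed probes, OOMKilled, image pull errors, etc.
kubectl describe pod <pod-name> -n prod

# Logs from the previous (crashed) container instance
kubectl logs <pod-name> -n prod --previous

# Cluster events in chronological order
kubectl get events -n prod --sort-by=.metadata.creationTimestamp

# A 502 during rollout often means the Service briefly had no ready endpoints
kubectl get endpoints myapp -n prod
```

If the endpoints list is empty during a deployment, readiness probes (or their absence) are usually the place to look next.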
This guide will help you configure Bitbucket Pipelines to automatically deploy a containerized application to Kubernetes. We’ll create a deployment in Kubernetes to run multiple instances of our application, then package a new version of our Node.js app in a new version of a Docker image and push this image to DockerHub. Finally, we’ll update our deployment using Pipelines to release the new Docker image without a service outage.
1: Push your code to the Bitbucket repository.
4: Create the CI/CD pipeline using the YAML file.
To build a Node.js application we need to…
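The steps above boil down to a handful of commands that a Pipelines step runs for us. A sketch follows; the image name `myuser/node-app`, the deployment/container name `node-app`, and the `DOCKERHUB_*` variables are placeholders (`BITBUCKET_COMMIT` is a built-in Pipelines variable).

```shell
# Build and test the Node.js app
npm install && npm test

# Package a new image version, tagged with the commit hash, and push to DockerHub
docker build -t myuser/node-app:$BITBUCKET_COMMIT .
echo "$DOCKERHUB_PASS" | docker login -u "$DOCKERHUB_USER" --password-stdin
docker push myuser/node-app:$BITBUCKET_COMMIT

# Rolling update: Kubernetes replaces pods gradually, so there is no outage
kubectl set image deployment/node-app node-app=myuser/node-app:$BITBUCKET_COMMIT
kubectl rollout status deployment/node-app
```

Tagging with the commit hash rather than `latest` makes rollbacks a one-line `kubectl rollout undo` away.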
The majority of businesses now use Docker to run applications. They devote a lot of time, energy, and resources to stabilizing their deployments and invest heavily in a variety of advanced monitoring techniques. Despite this, they experience poor performance, and the containers come under stress from heavy traffic. The aim of this blog is to help those running Java applications in Docker containers get optimal performance.
People often complain that the application does not seem to run as well as it did on the server. …
DevOps Engineer with 10+ years of experience in the IT industry. In-depth experience in building highly complex, scalable, secure, and distributed systems.