Jenkins and Nexus.

This post will help you manage and work with Jenkins to build, test and deploy your developments, and to push your artifacts to Nexus from Jenkins.
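As a quick taste of the Nexus part, here is a minimal sketch of uploading a built artifact to a Nexus 3 Maven hosted repository. The host, repository name, coordinates and credentials are placeholders to adapt; in a real job you would pull the credentials from the Jenkins credentials store:

    # Hypothetical example: upload a jar to a Nexus 3 "maven-releases" hosted repository.
    # Host, repository, coordinates and credentials are placeholders.
    curl -u "$NEXUS_USER:$NEXUS_PASS" --fail \
      --upload-file target/app-1.0.jar \
      "http://nexus.example.com:8081/repository/maven-releases/com/example/app/1.0/app-1.0.jar"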

Today, containers play an important part in the IT world. The first time I used Jenkins, Docker did not exist and the workspaces used to build projects lived inside the Jenkins host. This is why there will be different versions of this topic: one using Jenkins with a basic Linux Docker container, and a later one with Jenkins wrapped with DinD (Docker in Docker). DinD allows Jenkins to create an ephemeral container that builds the project and disappears at the end of the run.
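For the first, simpler setup, here is a minimal sketch using the official jenkins/jenkins image. Mounting the host's Docker socket is one pragmatic variant of the DinD idea (strictly speaking "Docker outside of Docker"); names and ports are assumptions to adapt:

    # Run Jenkins in a container, persisting its home in a named volume.
    # Mounting the host's Docker socket lets jobs spawn sibling build containers.
    docker run -d --name jenkins \
      -p 8080:8080 -p 50000:50000 \
      -v jenkins_home:/var/jenkins_home \
      -v /var/run/docker.sock:/var/run/docker.sock \
      jenkins/jenkins:lts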

Containerizing Jenkins and Nexus.
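Nexus itself can be containerized the same way; a minimal sketch with the official sonatype/nexus3 image:

    # Run Nexus 3 with its data persisted in a named volume.
    docker run -d --name nexus \
      -p 8081:8081 \
      -v nexus-data:/nexus-data \
      sonatype/nexus3
    # Recent versions write the initial admin password to /nexus-data/admin.password.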

Kafka

In this post, I will explain many different topics dedicated to Kafka/Zookeeper/KRaft.

Since Kafka is a tool I like a lot, let me share with you a video in French covering many different topics about Kafka, such as: what Kafka is, how it works in a cluster, how to install it, how to secure it, how to use it with Java, with Apache Camel, with Apache Spark, how to send Kafka and Zookeeper logs to ELK with Filebeat, how to monitor it with Zabbix, which patterns can be used, and what the best practices are.
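To give a first taste before the video, here is a minimal command-line sketch, assuming a local broker on localhost:9092 and Kafka's bin/ directory on your PATH:

    # Create a topic:
    kafka-topics.sh --create --topic demo --partitions 3 --replication-factor 1 \
      --bootstrap-server localhost:9092

    # Produce a message:
    echo "hello kafka" | kafka-console-producer.sh --topic demo --bootstrap-server localhost:9092

    # Consume it from the beginning:
    kafka-console-consumer.sh --topic demo --from-beginning --bootstrap-server localhost:9092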

The video is quite long, but I really enjoyed making it.

Coming soon:

Kafka pattern usage.

Kafka and security.

Kafka and Apache Camel.

Message Validation, enrichment.

KSQL.

Kafka Connect.

Monitoring with Zabbix/Nagios.

Kafka/Zookeeper and ELK.

ELK

This post is intended to share a tool I find very useful to store data, and especially logs.

  • ELK is a product containing 3 tools:
    • E: Elasticsearch
      • A Big Data store based on the Apache Lucene engine, storing only JSON documents.
      • Elasticsearch exposes a very useful REST API to communicate with (see the example requests after this list).
    • L: Logstash
      • A powerful data gateway that forwards data to Elasticsearch.
      • It is also an ETL that can transform the received data before forwarding it to Elasticsearch.
    • K: Kibana
      • Kibana is the front-end part allowing you:
        • To visualize data stored in Elasticsearch.
        • To create dashboards.
        • To create alerts.
        • To let developers post data to Elasticsearch through API requests (the Dev Tools console).
        • To visualize metrics.
        • To manage the stack (indices, policies, pipelines, roles, users, …).
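As a minimal sketch of that REST API, assuming a local, unsecured Elasticsearch on localhost:9200 (the index name app-logs is just an example):

    # Index a JSON document:
    curl -X POST "http://localhost:9200/app-logs/_doc" \
      -H 'Content-Type: application/json' \
      -d '{"timestamp": "2024-01-01T12:00:00Z", "level": "ERROR", "message": "Connection refused"}'

    # Search it back:
    curl -X GET "http://localhost:9200/app-logs/_search?q=level:ERROR"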

Difference between Docker Swarm, Kubernetes and OpenShift

In this post, I will describe the differences between Docker Swarm, Kubernetes and OpenShift:

Docker Swarm:
  • Easy and fast setup.
  • Works with other existing Docker tools.
  • Lightweight installation.
  • Open source.
  • Limited functionalities tied to the Docker API.
  • Updates for Swarm must be scheduled manually.
  • Limited fault tolerance.

Kubernetes:
  • Open source and modular.
  • Runs on any Linux OS.
  • Easy service organisation with pods.
  • Backed by years of expert experience.
  • Laborious installation and configuration.
  • Updates for Kubernetes are monitored and rolled out progressively.
  • Incompatible with the docker CLI and docker compose.

OpenShift:
  • Manages Kubernetes.
  • Helps abstract Kubernetes limitations, such as network features.
  • A Red Hat OS is mandatory (except for dev).
  • Security features are better managed than in Kubernetes.
  • Smaller community than the Kubernetes one.
  • OpenShift lets Kubernetes handle the updates.

Security

For security matters, Ansible has a feature called Ansible Vault to store sensitive data.

Since Ansible is an infrastructure-as-code technology, you need to store the code in a source control management service such as CVS, SVN, Git, TFS, …

So, to prevent anyone from reading sensitive data, use Ansible Vault to secure the content of your playbooks.

  • Create and keep sensitive data encrypted with AES:
    • Run the command line: ansible-vault create secret-info.yml
      • Enter a vault password twice.
      • Enter your sensitive data in the text editor.
  • Edit the vault:
    • ansible-vault edit secret-info.yml
    • Edit your sensitive data in the text editor.
  • Use the vault (see the full example after this list):
    • Add vars_files to your playbook:
      • vars_files:
      •   - secret-info.yml
    • ansible-playbook playbook.yml --ask-vault-pass
      • It will prompt for the vault password.
      • If you try to automate the runs, it could be a good idea to request the password from a secured tool such as HashiCorp Vault.
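Putting it together, a minimal end-to-end sketch; the variable db_password and the playbook below are hypothetical examples:

    # Create the encrypted file (prompts twice for the vault password),
    # then type your secrets in the editor, e.g.:  db_password: S3cr3t!
    ansible-vault create secret-info.yml

    # playbook.yml — a hypothetical playbook consuming the encrypted variable:
    # ---
    # - hosts: all
    #   vars_files:
    #     - secret-info.yml
    #   tasks:
    #     - name: Use the secret without printing it
    #       ansible.builtin.debug:
    #         msg: "Password length is {{ db_password | length }}"

    # Run it; you will be prompted for the vault password:
    ansible-playbook playbook.yml --ask-vault-pass

    # To automate instead, read the password from a file kept out of version control:
    ansible-playbook playbook.yml --vault-password-file ~/.vault_pass.txt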