Building Kibana Dashboards to Visualize Our Application Part 2
Improving logging in a Python web application for better insights and structured logs in Kibana.
Hello and welcome back! In the previous lesson, we successfully deployed our Login App and visualized its logs in Kibana. However, we noticed that the logs generated by our application were not very informative. Today, we will improve our logging mechanism to produce more structured and insightful logs.
Let’s start by examining how logs are generated in our existing application. In the Python web application’s repository, open the Dockerfile and note that the entire working directory is copied into the image. Then navigate to the application’s entry point by opening the app.py file. In app.py, we import Python’s built-in logging module to send logs. Below is the original logging configuration:
import logging
import re
from flask import Flask, render_template, request, redirect, url_for, flash

app = Flask(__name__)
app.secret_key = 'your_secret_key'  # Replace with a real secret key in production

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Default credentials
USERNAME = 'admin'
PASSWORD = 'password'

def is_weak_password(password):
    if len(password) < 8:
        return True
    if not re.search("[a-zA-Z]", password) or not re.search("[0-9]", password):
        return True
    return False

@app.before_request
def log_request_info():
    logger.info("Request method: %s", request.method)
    logger.info("User Agent: %s", request.user_agent)
    logger.info("Client IP: %s", request.remote_addr)

@app.after_request
def log_response_info(response):
    logger.info("Response status: %s", response.status)
    return response

@app.route('/', methods=['POST'])
def login():
    return render_template('login.html')
In this implementation, logger.info is used to send logs, which then appear in the Kubernetes pod logs.
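With logging.basicConfig's default format (%(levelname)s:%(name)s:%(message)s), and assuming the logger name resolves to app, each request produces a handful of plain-text lines roughly like the following (the values shown are illustrative):

INFO:app:Request method: POST
INFO:app:User Agent: Mozilla/5.0 (X11; Linux x86_64) ...
INFO:app:Client IP: 10.244.0.12
INFO:app:Response status: 200 OK

Each piece of information arrives as a separate free-text line, which is why these logs were not very informative once they reached Kibana.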
Improving the Logging Structure with a Custom JSON Formatter
To deliver more structured and insightful logs, we updated our application in a file named update_app.py. In this updated version, we continue to use the logging module and introduce a custom JSON formatter. This formatter writes each log entry as a JSON object with detailed key-value pairs, making it much easier for tools like Elasticsearch and Kibana to parse and analyze the logs. A sketch of the kind of formatter used in update_app.py is shown below.
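Since update_app.py is not reproduced here verbatim, the following is a minimal sketch of such a formatter built only on Python's standard library. The class name JsonFormatter and the field names (timestamp, level, logger, message, and the per-request extras) are illustrative choices, not the app's confirmed schema.

import json
import logging

from flask import Flask, request

app = Flask(__name__)  # mirrors the app structure shown in app.py


class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object per line."""

    def format(self, record):
        log_entry = {
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        # Pick up any key-value pairs attached via logger.info(..., extra={...})
        for key in ("method", "path", "client_ip", "user_agent", "status"):
            if hasattr(record, key):
                log_entry[key] = getattr(record, key)
        return json.dumps(log_entry)


# Route every log line through the JSON formatter
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logging.basicConfig(level=logging.INFO, handlers=[handler])
logger = logging.getLogger(__name__)


@app.before_request
def log_request_info():
    # One structured entry per request instead of several free-text lines
    logger.info(
        "incoming request",
        extra={
            "method": request.method,
            "path": request.path,
            "client_ip": request.remote_addr,
            "user_agent": str(request.user_agent),
        },
    )


@app.after_request
def log_response_info(response):
    logger.info("request completed", extra={"status": response.status_code})
    return response

Because each line is now a single JSON document, Elasticsearch can index every field individually, and Kibana can filter or aggregate on keys such as client_ip or status instead of searching through free-text messages.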
With the improved logging mechanism in place, the next step is to redeploy your application to the Kubernetes cluster and observe the new logs. This updated structure should provide better insights and more actionable data compared to the previous setup.
Be sure to update your Kubernetes manifests accordingly and monitor the logs in real time after redeployment to confirm that the new changes are taking effect.
That concludes this lesson. In the next session, we will dive deeper into analyzing these structured logs and building effective Kibana dashboards. Thank you for joining, and see you in the next lesson!