If you are coming from the Jenkins world, you may be wondering whether GitLab pipelines can accept parameters. They can, and I’ll demonstrate below how to define and run parameterized pipelines in GitLab. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Software Dependency Updates Using Renovate
Today, I’ll demo how to automate software dependency updates using the Renovate tool. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Implement LRU Cache in Python
To deviate a bit from my usual DevOps articles, I’ll show my attempts to implement an LRU cache in Python. We’ll see two methods: one using Python’s built-in OrderedDict class, the second using a custom doubly linked list implementation. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Go Docker CI in GitLab
Today, I’ll demo a sample Go Docker CI pipeline in GitLab. The pipeline will run on a sample containerized Go app, hence the name Go Docker. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Today, I’ll show how to keep a git repository in another git repository. It may sound crazy and unnecessary. Yet, I’ve stumbled upon a need to do that once during my career. So, I’ll share the way with the world. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Why Store Git Repository in another Git Repository?
If you came across this article, you probably know why you need to do such a bizarre thing. Yet, for those who wonder why one might need to store a git repository inside another git repository, I’ll provide an example.
Suppose your app needs to pull some code from a git repository (e.g. some script) and run it. So far, it seems like an easy use case. You test your app in a dev environment. Conveniently, it has access to your organization’s git repository hosting. Thus, there’s no issue with placing your scripts in some git repository and providing the repository URL and credentials to your app. Now, suppose your app runs in an air-gapped environment for security reasons. In that case, the app obviously won’t have access to the git repository hosting and must use a different solution. You might ask, why require git access in the first place? It’s a fair question for apps you fully control. Yet, what if you use some 3rd party product which you cannot modify? I stumbled on such a use case while using AWX.
Real-world use case requiring storing a Git Repository in another Git Repository
Simply put, AWX is a web app for managing ansible playbooks. Ansible requires an inventory of hosts to run the playbooks on, and such an inventory may come from a dynamic inventory script. AWX pulls this script from a git repository. That’s how it works and I can do nothing about it. Now imagine AWX running in a production air-gapped environment. How would you provide AWX with access to a git repository? Long story short, I came up with the solution of keeping the inventory script inside a bare git repository which resided in the code repository of AWX, out of which its deployable artifact was built. Let’s see the git commands I used to create a bare git repository inside another repository.
How to store Git Repository in another Git Repository?
To achieve that, we’ll need to store the inner repository as plain files. This way, it can be source controlled just like any other data you want to store in git. Follow the commands below to store one repository (inner below) inside another git repository (outer below):
- Create outer git repository:
mkdir /tmp/outer
cd /tmp/outer
git init
echo "i'm outer" > outer.txt
git add *
git commit -am 'outer'
- Create inner git repository:
mkdir /tmp/inner
cd /tmp/inner
git init
- Make a sample change inside inner:
echo "i'm inner" > inner.text
git add inner.text
git commit -am 'test'
- Copy the .git contents of the inner repo to outer:
mkdir /tmp/outer/inner
cp -r /tmp/inner/.git/* /tmp/outer/inner
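As a side note, an alternative to the mkdir and cp steps above could be a bare clone, which I believe produces an equivalent layout:
# alternative to copying .git manually: clone inner as a bare repo into outer
git clone --bare /tmp/inner /tmp/outer/inner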
- Now, commit inner inside the outer repository:
cd /tmp/outer
git add *
git commit -m 'inner inside outer'
That’s it! You can now push the outer repository, build a deployable artifact out of it and deploy it to production. Anyone or anything will be able to clone the inner repository to get its contents. Try it locally:
git clone /tmp/outer/inner /tmp/inner_contents
cat /tmp/inner_contents/inner.text
i'm inner
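As a side note, if the inner repository changes later, one way to refresh the embedded copy (a sketch reusing the paths from this article) is to redo the copy and commit again in the outer repository:
# refresh the embedded copy of inner after it changes
rm -rf /tmp/outer/inner
mkdir /tmp/outer/inner
cp -r /tmp/inner/.git/* /tmp/outer/inner
cd /tmp/outer
git add -A
git commit -m 'update inner inside outer'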
If you’ve followed along to this point, then you probably really needed this to work, as I did 🙂
Summary
That’s it about keeping a git repository in another git repository. Feel free to share.
If you found this article useful, take a look at the disclaimer for information on how to thank me.
You may find the articles below useful as well:
Today, I’ll show how to run a sample Terraform pipeline on GitLab. The pipeline will include GitLab Terraform CI templates. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Assume you want to clone a private GitLab repository in your GitLab pipeline. Let’s see how to do that. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Why Clone Repositories in GitLab Pipelines?
You probably already know why you need this. But if you have never done it and wonder why you might need it, here is an example: your GitLab pipeline may run some automation (e.g. pattern replacements, version bumps) on specific repositories. To perform changes in those repositories, you first need to clone the repository in your pipeline, make the changes, then commit and push them. While cloning repositories manually is straightforward using either the https or ssh URL, cloning in GitLab pipelines requires a slightly different URL. Let’s see how it looks and a demo GitLab pipeline using it.
Clone Private Repositories in GitLab Pipelines Demo
Let’s have a look at a sample GitLab pipeline which clones another private repository. In addition, it makes a sample change, commits and pushes it. Note that cloning a private repository requires authentication, while cloning a public one does not. To authenticate, you’d need to generate a personal or group access token with at least the read_repository scope (and write_repository, since the demo below also pushes changes). Add the token to the GitLab project or group masked variables.
Below is the pipeline, which uses the variable GITLAB_TOKEN.
stages:
  - update_repo

build-job:
  stage: update_repo
  before_script:
    - git config --global user.email "you@example.com"
    - git config --global user.name "Your Name"
  script:
    - echo "updating repo"
    - git clone https://oauth2:${GITLAB_TOKEN}@gitlab.com/[your_user]/[another_project_name].git
    - cd [another_project_name]
    - echo 'test' > test.txt
    - git add test.txt
    - git commit -am 'test'
    - git push
    - cd -
Note above that to clone the repository you need to run:
git clone https://oauth2:${GITLAB_TOKEN}@gitlab.com/[your_user]/[another_project_name].git
where GITLAB_TOKEN is defined as a GitLab project or group masked variable.
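As a side note, the push at the end of the job may itself trigger a pipeline in the target project. If you want to avoid that, GitLab lets you skip CI for a push, either by adding [skip ci] to the commit message or via a push option:
# skip creating a pipeline for this particular push
git push -o ci.skip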
Summary
That’s it about cloning private repositories in GitLab pipelines. As always, feel free to share. If you found this article useful, take a look at the disclaimer for information on how to thank me.
You may also find the articles below useful:
- GitLab Self-Hosted Runners Demo
- Get user’s permissions using kubectl
- Migration from Jenkins to GitLab
- Git Tricks: git commit --amend + git push --force
Recommended GitLab books on Amazon.
Use Ansible Vault in Python
So you want to use secrets stored in Ansible Vault in your Python apps. Let’s see how to do that. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
If you are not familiar with Ansible Vault, go over the brief introduction below.
Store secrets in Ansible Vault
So, you might already know that storing secrets in your source code is bad. Yet, your app uses secrets and you must store them in the source code repository. What can you do? You could opt for a fully fledged secret storage solution (e.g. HashiCorp Vault), yet that is overkill for a simple app using a few secrets. The simplest thing that comes to mind is encrypting the secrets and keeping them encrypted under source control. Ansible Vault allows just that.
Storing secrets in Ansible Vault step by step
- Create a vault.yml file and add your secrets in YAML format as below:
secret_name1: val1
secret_name2: val2
- Next, create a multi_password_file. Add the vault password to this file and add the file to .gitignore. This is the password which will encrypt the vault. Remember that if you lose this password file, you won’t be able to decrypt your Ansible vault.
- Install ansible. Installing it will install the ansible-vault binary as well and add it to your PATH.
- Run ansible-vault encrypt vault.yml --vault-password-file multi_password_file to encrypt your vault. Afterwards, vault.yml will start with $ANSIBLE_VAULT;1.1;AES256 and will contain just numbers. It’s this encrypted vault.yml that you can safely commit to your source code repository.
- Run ansible-vault decrypt vault.yml --vault-password-file multi_password_file to decrypt the vault. Then you’ll see your secrets in clear text again.
A consolidated sketch of the above commands follows right after this list.
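Putting it all together, here’s a minimal end-to-end sketch of the above steps (the vault password value below is just a placeholder, and it assumes ansible is already installed):
# create the vault file with secrets in YAML format
cat > vault.yml <<EOF
secret_name1: val1
secret_name2: val2
EOF
# store the vault password in a file and keep that file out of source control
echo 'my-vault-password' > multi_password_file   # placeholder password
echo 'multi_password_file' >> .gitignore
# encrypt the vault; vault.yml now starts with $ANSIBLE_VAULT;1.1;AES256
ansible-vault encrypt vault.yml --vault-password-file multi_password_file
# view the secrets without writing the decrypted file back to disk
ansible-vault view vault.yml --vault-password-file multi_password_file
# decrypt back to clear text if needed
ansible-vault decrypt vault.yml --vault-password-file multi_password_file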
Read Ansible Vault in Python
Now, let’s assume you want to use the secrets from the Ansible Vault in your Python app or script. How can you read them? You can do that using the ansible-vault package. Then use the below Python code for reading the vault:
from pathlib import Path
from ansible_vault import Vault  # pip install ansible-vault

# read the vault password; strip the trailing newline so it matches
# the password ansible-vault used during encryption
vault = Vault(Path('multi_password_file').read_text().strip())
# load and decrypt the vault contents into a Python dictionary
data = vault.load(open('vault.yml').read())
data is a Python dictionary which contains the Ansible Vault secrets in clear text, ready for your app or script to use.
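For example, with the vault.yml contents shown earlier, data['secret_name1'] would return 'val1'.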
Summary
That’s it about using Ansible Vault in Python. As always, feel free to share. If you found this article useful, take a look at the disclaimer for information on how to thank me.
You may also find the articles below useful:
- Azure-cli in Dockerfile in Alpine
- Podman Jenkins Agent
- Go Docker CI in GitLab
- GitLab Parameterized Pipelines
Recommended Kubernetes books on Amazon:
Helm charts acceptance tests
I already covered how to test helm charts and different tests you may want to run. Today, I’ll focus on helm charts acceptance tests. If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Add Redis Cache to WordPress
Using a Redis cache may significantly speed up your web apps. Today we’ll see how to add Redis cache to WordPress. To achieve that, I’ll deploy Redis, install the PHP Redis client extension and install the Redis Object Cache WordPress plugin.
If you later find this article useful, take a look at the disclaimer for information on how to thank me.
Why do you need Redis for WordPress?
After upgrading to WordPress 6.1, you may get a warning in the Site Health tool: “You should use a persistent object cache”. Why do you need it?
Using Persistent Object Cache will speed up page load times by saving on trips to the database from your web server.
WordPress optimization docs
Remember that database queries are among the most expensive operations, and some queries are performed for each page view. So why not cache their results in RAM? That’s where Redis comes to the rescue. It keeps the most frequently used DB query results in RAM. Yet, what will read those query results from Redis, and fall back to the database when they are not cached? Right, we need some backend between the webserver (e.g. Apache) and Redis along with the database. That’s where the Redis Object Cache plugin comes into play and provides that backend.
Deploy Redis
You can install and deploy Redis in multiple ways. For example, install and run it as an OS package, or use docker or Kubernetes. I’ll deploy containerized Redis, because it’s rather easy and doesn’t conflict with existing OS packages. The only OS packages you need are either docker or podman and their dependencies. I’ll use podman, which is a daemonless alternative to docker. The podman CLI is the same as docker’s, so you can use the same docker commands. Just replace the word docker with podman:
podman run --name redis -p 6379:6379 -d docker.io/redis
This method assumes you run WordPress not in a container, but rather as an Apache web app directly on your VPS (e.g. on Linode), for instance if you deployed WordPress as a marketplace app. If you run WordPress in a container, refer to the Docker section below for deploying Redis.
To check that your Redis is running and healthy, enter its container and ping it:
podman exec -it redis bash
# redis-cli
127.0.0.1:6379> ping
PONG
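Alternatively, you can run the same check in one shot without opening an interactive shell:
# prints PONG if Redis is up
podman exec redis redis-cli ping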
If you prefer using a managed Redis solution, consider Linode’s Redis marketplace app. Linode is a cloud provider recently purchased by Akamai. With this purchase, Akamai became a competitor in the cloud providers market.
Install Redis client php extension
Installing the Redis client PHP extension might be optional. You may skip it and do it only if you discover that the Redis Object Cache plugin is not working.
If you still need to install the client, you can install phpredis or other supported extensions like predis.
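For example, here’s a minimal sketch of installing phpredis on a Debian/Ubuntu host running Apache (package and service names may differ on your distribution):
# install the phpredis extension from the distribution packages
sudo apt-get install -y php-redis
# restart Apache so PHP picks up the new extension
sudo systemctl restart apache2
# verify the extension is loaded
php -m | grep redis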
Install Redis Object Cache plugin
You need the Redis Object Cache plugin because it first checks whether the required data from the WordPress DB is present in the Redis cache. If it is, it reads it from Redis; otherwise, it queries the database. The plugin is basically a persistent object cache backend for WordPress. I’ll use composer and wp-cli for installing the plugin and inspecting its status, as sketched below.
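A minimal sketch, assuming wp-cli is available and run from the WordPress root (the redis-cache slug and the wpackagist package name are my assumptions based on the plugin’s wordpress.org slug):
# with Bedrock, the plugin can be pulled in via composer from wpackagist
composer require wpackagist-plugin/redis-cache
# or install and activate it directly with wp-cli
wp plugin install redis-cache --activate
# enable the object cache drop-in and inspect its status
wp redis enable
wp redis status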
Configure WordPress to use Redis
If you use the Bedrock WordPress setup, add the two lines below to your application.php:
Config::define( 'WP_REDIS_HOST', '127.0.0.1');
Config::define( 'WP_REDIS_PORT', 6379 );
Add Redis cache to WordPress in Docker
If your WordPress setup is containerized, e.g. in a docker-compose stack, you can add Redis as an additional service:
redis:
  image: redis
  container_name: '${COMPOSE_PROJECT_NAME}-redis'
  restart: 'always'
  expose:
    - 6379
and bring it up using docker-compose up -d redis.
In that case, Config::define( 'WP_REDIS_HOST', '127.0.0.1'); will have to change to Config::define( 'WP_REDIS_HOST', '${COMPOSE_PROJECT_NAME}-redis');. In addition, you’ll have to add the COMPOSE_PROJECT_NAME variable to the .env file. Of course, the above steps assume you use the Bedrock WordPress setup.
Summary
That’s it about adding Redis cache to WordPress. Feel free to share this article.
If you found this article useful, take a look at the disclaimer for information on how to thank me.
You may find the below articles I wrote interesting:
- Adding Google analytics to WordPress website
- Create WordPress site using Docker fast
- Point host name to WordPress in Docker
- Pointing hostname to WordPress using Kubernetes ingress
Find out recommended Redis books on Amazon.