Articles
Loading a Neo4J 3.X database into a Neo4J 4.X instance
Importing data from a Neo4J v3 database into a v4 one can be a hassle. Here are the steps to carry out the migration.
3D printed bracket for fume extractor
I got myself a fume extractor, but it takes up a rather large amount of real estate on my desk. To be able to move it out of the way easily, I decided to mount it on an arm, so I designed and 3D printed an attachment bracket for it.
RESTful API design
There are four main kinds of operations when working with a database: Create, Read, Update and Delete (CRUD). However, databases are usually not exposed directly to clients. Instead, those operations are performed by a server-side application with a client-facing API. The most common type of API nowadays is based on HTTP. An HTTP API can be built with complete freedom, but guidelines for best-practice HTTP API design have emerged. An API following those guidelines is called a REST (or RESTful) API.
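To illustrate the idea, here is a minimal sketch (not taken from the article) of how the four CRUD operations typically map to HTTP methods in a REST API. It uses Flask and a hypothetical `items` resource backed by an in-memory dictionary:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
items = {}   # in-memory store standing in for a real database
next_id = 1

@app.route("/items", methods=["POST"])                  # Create
def create_item():
    global next_id
    items[next_id] = request.get_json()
    next_id += 1
    return jsonify(id=next_id - 1), 201

@app.route("/items/<int:item_id>", methods=["GET"])     # Read
def read_item(item_id):
    return jsonify(items[item_id])

@app.route("/items/<int:item_id>", methods=["PUT"])     # Update
def update_item(item_id):
    items[item_id] = request.get_json()
    return jsonify(items[item_id])

@app.route("/items/<int:item_id>", methods=["DELETE"])  # Delete
def delete_item(item_id):
    items.pop(item_id, None)
    return "", 204
```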
Self-hosted GitLab instance for DevOps on Ubuntu 18.04
GitLab provides many of the tools needed for the DevOps cycle of an application. In this guide, we'll install a GitLab instance on our own server and configure it to fit our DevOps needs, using a fresh install of Ubuntu 18.04 as a base.
Deploying a TensorFlow model on a Jetson Nano using TensorFlow serving and K3s
The Nvidia Jetson Nano is a low-cost platform for AI applications, ideal for edge computing. However, due to its ARM CPU architecture, deploying applications to the SBC can be challenging. In this guide, we'll install and configure K3s, a lightweight Kubernetes distribution made specifically for edge devices. Once that is done, we'll build and deploy a TensorFlow model in the K3s cluster.
Containerization of a Flask application
Flask can be seen as Python's equivalent of Express. However, while an Express application is essentially production-ready, a Flask app executed by itself warns that its built-in server is meant for development only and should not be used in production.
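For context, a minimal Flask application of the kind discussed here might look like the following sketch (the route and module names are just placeholders):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello, world!"

if __name__ == "__main__":
    # Running the module directly starts Flask's built-in development server,
    # which is the case the warning mentioned above refers to.
    app.run(host="0.0.0.0", port=5000)
```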
Deployment of a TensorFlow model to Kubernetes
Let’s imagine that you’ve just finished training your new TensorFlow model and want to start using it in your application(s). One obvious way to do so is to simply import it into the source code of every application that uses it. However, it can be more versatile to keep your model in one place as a standalone service and have applications exchange data with it through API calls. This article goes through the steps of building such a system and deploying the result to Kubernetes.
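To give a feel for what exchanging data through API calls can look like, here is a hedged sketch of a client querying a model served over TensorFlow Serving's REST API; the host, port and model name (`my_model`) are placeholders, and the input shape depends entirely on the model being served:

```python
import requests

# TensorFlow Serving exposes a REST endpoint of the form
# http://<host>:8501/v1/models/<model_name>:predict
SERVER_URL = "http://localhost:8501/v1/models/my_model:predict"

def predict(instances):
    """Send a batch of inputs to the model server and return its predictions."""
    response = requests.post(SERVER_URL, json={"instances": instances})
    response.raise_for_status()
    return response.json()["predictions"]

if __name__ == "__main__":
    # Example call with a single dummy input vector.
    print(predict([[1.0, 2.0, 3.0]]))
```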
Node.js DevOps example
In this article, we’ll build a simple Node.js application that uses Express to respond to HTTP requests. In order to deploy this application to production, we’ll also configure a GitLab CI/CD pipeline that dockerizes it and deploys its container to a Kubernetes cluster.
GitLab CI Microk8s integration
GitLab provides Kubernetes integration out of the box, which means that GitLab CI/CD pipelines can easily be used to deploy applications to Kubernetes. This guide shows how to integrate a Kubernetes cluster into a GitLab project, following the GitLab documentation. In this particular case, the cluster will be provided by the Microk8s Kubernetes distribution.
Application containerization
Let's imagine a developer building an application on their computer, an application that is meant to be deployed on a different machine (the production environment). In order to run properly, the application requires multiple libraries, binaries and packages. For example, a Python program requires the Python interpreter as well as all the Python modules it imports.