Posts

GitLab - trigger keyword

We are going to talk about the trigger keyword today. With this feature you can define a downstream pipeline trigger, so you can trigger a pipeline in any project (you must have access to that project, of course). There are two types of downstream pipelines:

- multi-project pipelines
- child pipelines

Let's omit child pipelines and have a look at multi-project pipelines. It is very easy to use: you just need to provide a path to the project and that's it. Remember that you can only trigger a pipeline, not a job! You can also provide more information, like environment variables. As always, I am going to provide some examples so you will understand it better. The first example: you want to trigger a pipeline in another project after the deployment of your service(s). How to do that? Here we want to trigger this pipeline (project my/run-tests ). Let's say that we want to run some tests after each deployment: we need to add the job which will trigger this pipeline in my/run-...
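A minimal sketch of such a trigger job; only the my/run-tests path comes from the post, while the job name, stage, downstream branch, and variable are assumptions:

```yaml
run-downstream-tests:
  stage: deploy                    # assumed stage name
  variables:
    UPSTREAM_SERVICE: my-service   # hypothetical variable forwarded to the downstream pipeline
  trigger:
    project: my/run-tests          # the downstream project from the post
    branch: master                 # assumed downstream branch
```

Job-level variables defined on a trigger job are passed to the downstream pipeline, which is how you hand extra information (like environment variables) to the triggered project.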

You have reached your pull rate limit!

Ahh, yes! You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit I bet you have encountered this problem if you are reading this post. According to this document: Beginning November 2, 2020, progressive enforcement of rate limits for anonymous and authenticated Docker Hub usage goes into effect. This means that anonymous and free Docker Hub users will have usage restrictions gradually placed on container image pull requests. Sadly, this happened on our clusters (AWS EKS). How to fix it? I wanted to spawn a DaemonSet object where I could run the docker login command and this way change config.json on every node. But after that you need to restart the docker process on every node, and I still do not know how to do that on AWS EKS. So, a temporary fix was to create a Secret object and then link it to every ServiceAccount object. It is "a hack", but we needed a working solution very fast. We ...
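A minimal sketch of that workaround, assuming hypothetical credential values, the default ServiceAccount, and the default namespace:

```sh
# Create a docker-registry Secret holding Docker Hub credentials
kubectl create secret docker-registry dockerhub-creds \
  --docker-server=https://index.docker.io/v1/ \
  --docker-username=<your-user> \
  --docker-password=<your-token>

# Link it to a ServiceAccount so pods using that account
# pull images as an authenticated Docker Hub user
kubectl patch serviceaccount default \
  -p '{"imagePullSecrets": [{"name": "dockerhub-creds"}]}'
```

The patch has to be repeated for every ServiceAccount (in every namespace) whose pods pull from Docker Hub, which is why this counts as a hack rather than a proper fix.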

GitLab - rules keyword

Today we are going to learn more about the rules keyword in GitLab. Normally, when you want to spawn your job on a specific branch or only in merge requests, you can just use the only/except keywords. But sometimes that is not enough. For example, what if you want to always spawn a job on the develop branch and run it automatically, but in merge requests you want to run it manually? You cannot (easily) do that with the only/except keywords. Excerpt from the GitLab documentation about the rules keyword: The rules keyword can be used to include or exclude jobs in pipelines. Rules are evaluated in order until the first match. When matched, the job is either included or excluded from the pipeline, depending on the configuration. If included, the job also has certain attributes added to it. rules replaces only/except and can't be used in conjunction with it. If you attempt to use both keywords in the same job, the linter returns a key may not be used with rules ...
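A minimal sketch of the develop-automatic / merge-request-manual setup described above (the job name and script are hypothetical, and workflow configuration for merge request pipelines is left out):

```yaml
integration-tests:
  script: ./run-tests.sh
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop"'
      when: on_success              # added to the pipeline and run automatically on develop
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      when: manual                  # added to merge request pipelines, but must be started by hand
```

Because rules are evaluated in order until the first match, the develop rule wins on branch pipelines and the merge request rule applies everywhere else it matches.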

GitLab - extends keyword

In the previous post I wrote about the include keyword. Now, let's dive into the extends keyword. What can you do with it? Here is the definition from the GitLab documentation: extends defines entry names that a job that uses extends inherits from. It's an alternative to using YAML anchors and is a little more flexible and readable. So, when is it useful? It is useful when you want to be DRY and keep your setup clean. Let's assume that you want to build a Docker image and give developers the possibility to deploy the image to dev , staging and prod environments. We are going to do that without the extends keyword, and after that we will think about how we can do it better. Example without the extends keyword: As you can see, there is a lot of code which repeats itself: stage, image, when and script . We can remove a lot of code! So, let's do that. We are going to create a hidden job .deploy (you will not see this job in a pipeline) and the rest of the jobs ...
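A minimal sketch of that refactoring, assuming a hypothetical image and deploy script:

```yaml
.deploy:                           # hidden job: never appears in the pipeline itself
  stage: deploy
  image: alpine:latest             # hypothetical image
  when: manual
  script:
    - ./deploy.sh "$ENVIRONMENT"   # hypothetical deploy script

deploy-dev:
  extends: .deploy
  variables:
    ENVIRONMENT: dev

deploy-staging:
  extends: .deploy
  variables:
    ENVIRONMENT: staging

deploy-prod:
  extends: .deploy
  variables:
    ENVIRONMENT: prod
```

The repeated stage, image, when and script now live in one place; each concrete job only declares what actually differs between environments.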

GitLab - include keyword

Some time ago GitLab introduced the include keyword, which allows the inclusion of external YAML files. Everyone knows that microservices like to multiply. Thus, this feature is very useful when you have microservices, because you do not have to copy & paste the code. Let's see what you can do with it. Let's assume that we use Docker and we need to build an image. If you want to build an image and deploy your microservice, you can just create a new .gitlab-ci.yml file in your project and add two jobs: build and deploy . When you have three or four microservices, you may be tempted to copy & paste this code to the rest of the projects. But if you have more projects and you need to change the logic of deployments, then you have a problem, because you need to edit many .gitlab-ci.yml files. You do not want to do that, so how to deal with it? Use the include keyword! It is very simple. For me, in most cases creating one GitLab template file is enough. Let's call it git...
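A minimal sketch of what each microservice's .gitlab-ci.yml can shrink to, assuming a hypothetical template project and file path:

```yaml
include:
  - project: my-group/ci-templates        # hypothetical project holding the shared template
    ref: master                           # assumed ref
    file: /templates/build-and-deploy.yml # hypothetical template defining build and deploy
```

Changing the deployment logic then means editing one template file instead of every project's .gitlab-ci.yml.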

Bug in (datetime|time).strptime - AttributeError: _strptime

Issue 7980. This bug occurs only when you use threads, and only once. The strptime method itself is thread-safe, but there is a severe caveat: the first use of strptime is not thread-safe (underneath, the _strptime module is imported lazily, and that import may raise AttributeError ). How to fix it? Just import _strptime or call strptime once before starting any thread. It seems this has been happening since 2010.
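A minimal sketch of the workaround (the date format and thread count are arbitrary):

```python
import threading
import time

# Work around https://bugs.python.org/issue7980: the first call to
# strptime imports _strptime lazily, and that import is not thread-safe.
# Trigger the import in the main thread before any worker starts.
import _strptime  # noqa: F401


def worker():
    # Safe now: _strptime is already imported
    time.strptime("2020-11-02", "%Y-%m-%d")


threads = [threading.Thread(target=worker) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Calling time.strptime(...) once in the main thread before spawning workers achieves the same thing as the explicit import.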

Py.test - Splitting conftest file

If you have to maintain a massive project, you probably have many fixtures in the conftest file. And there is a problem: this file grows and grows. So, at some point, you decide to split this huge file into smaller files. But py.test has to know about the fixtures kept in these files. So, what can you do? I see three patterns for this:

- import these guys inside the conftest.py file
- create more conftest.py files
- use pytest_plugins (a sketch follows below)

Import fixtures inside the conftest file. You can, for example, do this:

```python
from tests.my_fixtures.fixs1 import (
    fix11, fix12, fix13, fix14, fix15, ...
)
from tests.my_fixtures.fixs2 import (
    fix21, fix22, fix23,
)
from tests.my_fixtures.fixs3 import (
    fix31, fix32, fix33,
)
```

But there are some problems with this approach. You import them in an explicit way, so if you create a new fixture, you have to remember to add it to the import clause. So, this strategy is not the best solution. If you have many fi...
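For completeness, a minimal sketch of the pytest_plugins pattern from the list above (the module paths reuse the hypothetical ones from the import example):

```python
# conftest.py at the root of the test suite
pytest_plugins = [
    "tests.my_fixtures.fixs1",
    "tests.my_fixtures.fixs2",
    "tests.my_fixtures.fixs3",
]
```

Each listed module is loaded as a plugin, so every fixture defined in it becomes available without naming fixtures one by one. Note that recent pytest versions require pytest_plugins to be declared in the top-level conftest.py.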