
YPC Algorithm REST API

This REST API was built to accept pushes of young professionals and jobs from the jobify website.

It uses MongoDB as data storage.

It communicates with the matching workers using RabbitMQ message queuing.


The Python architecture is divided into three parts:

  1. The data model, which the other components share.
  2. The REST API. It receives the Starhunter updates, pushes them into MongoDB, and creates tasks for the worker.
  3. The worker. It consumes tasks and pushes the results back to the Starhunter. Every change of an entity leads to one task being created and one push back to the Starhunter.
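As an illustration only, the shared data model might look something like the sketch below. The class and field names are assumptions, not the project's actual layout; the real fields come from the Starhunter payloads.

```python
# model.py -- illustrative sketch; class and field names are assumptions
from dataclasses import dataclass, field, asdict

@dataclass
class YoungProfessional:
    """A young professional pushed from the jobify website (fields assumed)."""
    yp_id: str
    skills: list = field(default_factory=list)

@dataclass
class Job:
    """A job posting pushed from the jobify website (fields assumed)."""
    job_id: str
    required_skills: list = field(default_factory=list)

def to_mongo_doc(entity):
    """Serialize an entity into a plain dict for insertion into MongoDB."""
    return asdict(entity)
```

Both the REST API and the worker would import such a module so they agree on the shape of the stored documents.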

The REST API and the workers communicate via a message queue: on the Python side we use Celery, with RabbitMQ as the broker.

Working environment

The API runs in a separate environment provided by a Docker image. First of all, you need to install Docker and docker-compose.


The PyCharm debugger can be attached to Docker containers, which is very useful for debugging.

Please familiarize yourself with this method before going on, and create separate run profiles for running the worker and the REST API with the Docker Compose context.


The API part can be run without major problems, either locally or in the Docker container. Long waiting times most likely indicate that the database cannot be reached.


The worker can be a bit tricky to debug.

However, there are several ways to do this:

Run a test worker in pure Python

You can run the Flask API in the Docker container and use your local Python installation to run the worker code as plain Python. Please edit the file for your current debug action (e.g. which yp or cp to process).
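The idea is to bypass Celery entirely and call the task function as plain Python, so your local debugger stops at breakpoints. The module and function names below are placeholders, not the project's actual ones.

```python
# debug_worker.py -- run one task as plain Python; names are placeholders
def process_entity(entity_id, entity_type):
    """Stand-in for the real matching task imported from the worker module."""
    print(f"processing {entity_type} {entity_id}")
    return {"entity_id": entity_id, "entity_type": entity_type}

if __name__ == "__main__":
    # Edit these values for your current debug action
    # (e.g. which yp or cp to process).
    result = process_entity("42", "yp")
    print(result)
```

Because nothing goes through the broker, no RabbitMQ or Celery setup is needed for this style of debugging; only the MongoDB connection has to work.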

Run a test worker with celery

You can run the worker with Celery.

This can be done in the container (it happens automatically when you run docker-compose up). This is useful when you want a test calculation based on some changes.

You can also run the worker with Celery locally. This is rather complicated to achieve, and I would not recommend it unless you suspect trouble in the Celery multitasking or the worker data exchange.

You can also attach a worker (with an attached debug console) to the prod system. Just provide the relevant environment variables and a tunnel to the MongoDB and RabbitMQ servers.
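The relevant settings would typically be read from environment variables, roughly like this. The variable names and default URLs are assumptions for illustration; use whatever names the project actually defines.

```python
# settings.py -- sketch of env-based config; variable names are assumptions
import os

def load_settings():
    """Collect broker and database URLs, falling back to local defaults."""
    return {
        "mongo_uri": os.environ.get(
            "MONGO_URI", "mongodb://localhost:27017"
        ),
        "rabbitmq_uri": os.environ.get(
            "RABBITMQ_URI", "amqp://guest:guest@localhost:5672//"
        ),
    }
```

Pointing these variables at an SSH tunnel's local ports is enough to let the locally running worker talk to the prod MongoDB and RabbitMQ.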

Get it up & running

Just type

docker-compose up

That’s it!

Get test data