YPC Algorithm REST API

This REST API was built to accept pushes of young professionals and jobs from the jobify website.

It uses MongoDB as data storage.

It communicates with the matching workers using RabbitMq message queuing.

Architecture

The Python code base is divided into three parts:

  1. The data model that the other components use.
  2. The REST API. It receives the Starhunter updates, pushes them into MongoDB, and creates tasks for the worker.
  3. The worker. It consumes tasks and pushes the results back to the Starhunter. Every change of an entity leads to one task being created and one push back to the Starhunter.

The REST API and the workers communicate through a message queue service: on the Python side we use Celery, and the broker is RabbitMQ.

Working environment

The API runs in a separate environment provided by a Docker image. First of all, you need to install Docker and docker-compose.

Debugging

The PyCharm debugger can be attached to Docker containers, which is very useful for debugging.

Please familiarize yourself with this method before going on, and create separate run profiles for running the worker and the REST API within the Docker Compose context.

API

The API part can be run locally or in the Docker container without major problems. Long waiting times most likely indicate that the database cannot be reached.
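To rule out connectivity quickly, a small stdlib check can tell you whether the MongoDB port answers at all. The host and port here are assumptions; adjust them to your docker-compose setup:

```python
# Quick TCP reachability check for MongoDB before starting the API.
# Host and port are assumptions -- adjust to your environment.
import socket

def can_reach(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 27017 is MongoDB's default port; replace "localhost" with the
    # compose service name when running inside the container network.
    print("MongoDB reachable:", can_reach("localhost", 27017))
```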

Worker

The worker can be a little bit tricky to debug.

However, there are some possibilities to do this:

Run a test worker in pure Python

You can run the Flask API in the Docker container and use your local Python installation to run `python worker_test_runner.py`. Please edit the file for your current debug action (e.g. which yp or cp to process).
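Such an edit might look along these lines; the function name and entity id below are hypothetical stand-ins, not the module's real contents:

```python
# Hypothetical sketch of a worker_test_runner.py debug edit.
# process_entity and the id are stand-ins for the project's real
# function and data -- replace them with the actual ones.

def process_entity(entity_id: str) -> dict:
    # Stand-in for the matching routine the worker would run.
    return {"entity_id": entity_id, "status": "processed"}

if __name__ == "__main__":
    # Point this at the yp or cp you currently want to debug.
    print(process_entity("yp-1234"))
```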

Run a test worker with Celery

You can run the worker with Celery.

This can be done in the container (it happens automatically when you run docker-compose up). This is useful when you want a test calculation based on some changes.

You can also run the worker with Celery locally. This is rather complicated to achieve, and I would not recommend it unless you suspect trouble in the Celery multitasking or the worker data exchange.

You can also attach a worker (with an attached debug console) to the prod system. Just provide the relevant environment variables and a tunnel to the MongoDB and RabbitMQ servers.
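A possible shape for that setup is sketched below. The variable names, ports, and hostnames are assumptions; use the real values from your deployment:

```shell
# Hypothetical environment for attaching a local worker to prod.
# Variable names, ports, and hosts are assumptions.
export MONGO_HOST=localhost        # reached through the SSH tunnel
export MONGO_PORT=27017
export RABBITMQ_HOST=localhost     # reached through the SSH tunnel
export RABBITMQ_PORT=5672

# Forward both services through SSH (replace user@prod.example.com and
# the internal hostnames with the real ones):
#   ssh -N -L 27017:mongo-internal:27017 -L 5672:rabbitmq-internal:5672 user@prod.example.com

echo "Worker will connect to $MONGO_HOST:$MONGO_PORT and $RABBITMQ_HOST:$RABBITMQ_PORT"
```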

Get it up & running

Just type

docker-compose up

That’s it!
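The repository's docker-compose.yml wires the pieces together. As a rough, hypothetical sketch only (service names, images, ports, and build contexts below are assumptions, not the actual file):

```yaml
# Hypothetical service layout -- not the repository's actual
# docker-compose.yml.
services:
  api:
    build: .
    ports:
      - "5000:5000"        # Flask REST API
    depends_on: [mongo, rabbitmq]
  worker:
    build: .
    command: celery worker  # consumes tasks from RabbitMQ
    depends_on: [mongo, rabbitmq]
  mongo:
    image: mongo            # data storage
  rabbitmq:
    image: rabbitmq         # message broker
```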

Get test data