Symmetric denial of service - Python

Need

Prevention of service outages caused by repeated resource-intensive requests

Context

  • Usage of Python 3.0+ for developing Python applications
  • Usage of Django for building web applications in Python

Description

Non compliant code

from django.http import JsonResponse
from time import sleep

def heavy_process(request):
    # Simulate a heavy, long-running task
    sleep(100)
    return JsonResponse({"status": "done"})

The function heavy_process above simulates a resource-intensive task: it sleeps for 100 seconds before responding. It is a Django view function that responds to an HTTP request.

The vulnerability lies in the fact that the function ties up server resources (in this case, a worker for 100 seconds) for every request it handles. An attacker who repeatedly sends requests to this endpoint can exhaust the server's resources, causing a Denial of Service (DoS): the server becomes too busy handling the attacker's requests to respond to legitimate ones.

This is an example of a Symmetric Denial of Service vulnerability: the attacker needs no special resources or amplification to perform the attack, only a steady stream of ordinary requests to this endpoint.

Steps

  • Implement rate limiting to prevent a user from making too many requests in a short period of time.
  • Use asynchronous processing for heavy tasks to prevent blocking the main thread.
  • Use a queue system to manage heavy tasks and process them in the background.
  • Implement a timeout for requests to prevent long running processes from blocking the server.
  • Optimize the heavy process to reduce its execution time and resource consumption.

Compliant code

from django.http import JsonResponse
from time import sleep
from django.core.cache import cache
from django.views.decorators.cache import cache_page
from django.views.decorators.vary import vary_on_cookie
from concurrent.futures import ThreadPoolExecutor, TimeoutError

# Thread pool for running heavy tasks off the request thread
executor = ThreadPoolExecutor(max_workers=5)

def rate_limited(request):
    # Identify the client by IP address
    ip = request.META.get('REMOTE_ADDR')

    # Reject the request if this IP has made a request recently
    if cache.get(ip):
        return JsonResponse({"status": "Too many requests"}, status=429)

    # Record the IP address with an expiry time of 60 seconds
    cache.set(ip, True, 60)

    return heavy_process(request)

@vary_on_cookie
@cache_page(60 * 15)  # Cache the result for 15 minutes
def heavy_process(request):
    try:
        # Run the heavy task in the pool; give up waiting after 5 seconds.
        # Note: the timeout only stops the wait -- the worker thread keeps
        # running, but the pool size bounds total concurrent heavy work.
        executor.submit(sleep, 100).result(timeout=5)
    except TimeoutError:
        return JsonResponse({"status": "Request timed out"}, status=408)

    return JsonResponse({"status": "done"})

The code above fixes the denial of service vulnerability by implementing several measures:

1. Rate Limiting: This is done in the rate_limited function. It uses Django's caching framework to store the IP addresses of clients. If a client makes a request and their IP address is in the cache, the function returns a 429 status code (Too Many Requests) and does not process the request. If the IP address is not in the cache, it is added with an expiry time of 60 seconds.
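Rate limiting is also commonly implemented as a counter per time window rather than a single flag, allowing a few requests per minute instead of one. The framework-agnostic sketch below illustrates that variant; the in-memory store, class name, and limits are illustrative assumptions, not part of the code above:

```python
import time

class RateLimiter:
    """Allow at most `limit` requests per `window` seconds per client key."""

    def __init__(self, limit=5, window=60):
        self.limit = limit
        self.window = window
        self._hits = {}  # key -> list of request timestamps

    def allow(self, key):
        now = time.monotonic()
        # Keep only the timestamps that fall inside the current window
        recent = [t for t in self._hits.get(key, []) if now - t < self.window]
        if len(recent) >= self.limit:
            self._hits[key] = recent
            return False
        recent.append(now)
        self._hits[key] = recent
        return True

limiter = RateLimiter(limit=3, window=60)
results = [limiter.allow("10.0.0.1") for _ in range(5)]
print(results)  # the first 3 requests are allowed, the rest rejected
```

In production, the per-client state would live in a shared store such as Django's cache rather than a process-local dictionary, so that limits hold across workers.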

2. Asynchronous Processing: The heavy_process function is modified to run the heavy process in a separate thread using a ThreadPoolExecutor. This prevents the main thread from being blocked by the heavy process.
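The same idea can be shown without Django: submit the task to a pool, return a token immediately instead of blocking, and let the caller poll for completion later. The job registry below is a simplified, illustrative sketch:

```python
import uuid
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=2)
jobs = {}  # job id -> Future

def start_job(task, *args):
    """Submit a task and return a token instead of blocking on the result."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = executor.submit(task, *args)
    return job_id

def job_status(job_id):
    # The request thread never waits; it only inspects the Future
    future = jobs[job_id]
    return "done" if future.done() else "pending"

job = start_job(sum, [1, 2, 3])
jobs[job].result()       # wait here only for the demonstration
print(job_status(job))   # done
```

A task queue such as Celery plays the same role at scale, with the added benefit that jobs survive process restarts.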

3. Timeout: the result method of the Future returned by submit is called with a timeout of 5 seconds. If the heavy process takes longer than this, a TimeoutError is raised and the function returns a 408 status code (Request Timeout). Note that the timeout only stops the wait; the worker thread continues running until the task finishes.
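This subtlety is worth seeing in isolation: Future.result(timeout=...) raises TimeoutError but does not cancel the running task. The standalone sketch below uses small durations purely for illustration:

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError
from time import sleep

executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(sleep, 2)  # the task takes 2 seconds

try:
    future.result(timeout=0.1)      # but we only wait 0.1 seconds
    outcome = "done"
except TimeoutError:
    # The caller moves on, yet sleep() keeps running in its thread
    outcome = "timed out"

print(outcome)            # timed out
print(future.running())   # True: the task was not cancelled
```

This is why bounding the pool with max_workers matters: timed-out tasks still occupy workers until they finish.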

4. Caching: The heavy_process function is decorated with Django's cache_page decorator, which caches the result of the function for 15 minutes. This means that if the same request is made within this time period, the heavy process does not need to be run again.
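The effect of cache_page can be illustrated with plain memoization: the expensive computation runs once per distinct input, and repeated calls are served from the cache. The call counter below exists only for demonstration:

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def heavy_computation(n):
    global calls
    calls += 1          # count how often the real work actually runs
    return sum(range(n))

heavy_computation(1000)
heavy_computation(1000)  # served from the cache
heavy_computation(1000)  # served from the cache
print(calls)  # 1: the work ran only once
```

cache_page works analogously at the HTTP layer, keyed on the request URL (and, with vary decorators, on selected headers), with a time-based expiry instead of lru_cache's size-based eviction.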

5. Vary on Cookie: The vary_on_cookie decorator is used to ensure that the caching works correctly when cookies are used. This decorator makes sure that a separate version of the page is cached for each unique set of cookies.
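Conceptually, varying on the cookie widens the cache key so that clients with different cookies never share a cached response. A dictionary-based sketch of that idea, with illustrative names and a hypothetical render function:

```python
page_cache = {}

def render_page(path, cookies):
    # Hypothetical stand-in for an expensive per-user page render
    return f"page for {path} as {cookies.get('session', 'anonymous')}"

def cached_render(path, cookies):
    # The cache key varies on the cookies, mirroring vary_on_cookie
    key = (path, tuple(sorted(cookies.items())))
    if key not in page_cache:
        page_cache[key] = render_page(path, cookies)
    return page_cache[key]

print(cached_render("/heavy", {"session": "alice"}))  # page for /heavy as alice
print(cached_render("/heavy", {"session": "bob"}))    # page for /heavy as bob
print(len(page_cache))  # 2: one cache entry per distinct cookie set
```

Without this, a per-user page cached for one user could be served to another; the trade-off is a lower cache hit rate, since each cookie set gets its own entry.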

These measures together help to prevent a denial of service attack by limiting the rate of requests, preventing long-running processes from blocking the server, and reducing the need to run the heavy process by caching its result.

References