
SynchronousQueue


SynchronousQueue is a unique BlockingQueue implementation that has zero capacity. It acts as a rendezvous point where a producer and a consumer meet to exchange an element directly.


Mechanism: The “Direct Handoff” #

In a SynchronousQueue, a producer cannot insert an element unless a consumer is already waiting to receive it. Similarly, a consumer cannot remove an element unless a producer is already waiting to provide it. This is known as a direct handoff.

// In SynchronousQueue, the non-blocking methods fail immediately
queue.offer(e); // Returns false if no consumer is waiting
queue.poll();   // Returns null if no producer is waiting

// These block until a match is found
queue.put(e);
queue.take();

Internally, SynchronousQueue uses a dual stack (in the default, non-fair mode) or a dual queue (in fair mode, enabled via the constructor). When a thread arrives and doesn’t find a match, it enqueues a “node” representing its request and blocks. When a matching thread arrives, it fulfills the request and both threads proceed.
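The contrast between the failing non-blocking calls and a successful handoff can be seen in a small sketch (the class name `HandoffDemo` is illustrative):

```java
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.TimeUnit;

public class HandoffDemo {
    public static void main(String[] args) throws InterruptedException {
        // Default constructor: non-fair (dual-stack) mode
        SynchronousQueue<String> queue = new SynchronousQueue<>();

        // No consumer is waiting, so a non-blocking offer fails immediately...
        System.out.println("offer with no consumer: " + queue.offer("task")); // false

        // ...and a non-blocking poll returns null with no producer waiting
        System.out.println("poll with no producer: " + queue.poll()); // null

        // A timed offer succeeds once a consumer arrives within the window
        Thread consumer = new Thread(() -> {
            try {
                System.out.println("consumer took: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();
        boolean handedOff = queue.offer("task", 1, TimeUnit.SECONDS);
        System.out.println("timed offer with consumer: " + handedOff); // true
        consumer.join();
    }
}
```

Passing `true` to the constructor (`new SynchronousQueue<>(true)`) switches to the fair dual-queue, which matches waiting threads in FIFO order.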

Canonical Usage #

When to use: Use SynchronousQueue when you want to minimize the latency of passing a task from a producer to a consumer and you don’t need the overhead of an intermediate buffer. It is the default queue for Executors.newCachedThreadPool().
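The connection to the cached thread pool is visible in its construction. Per the Executors documentation, newCachedThreadPool() builds roughly the following (a sketch, not the exact JDK source):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CachedPoolSketch {
    public static void main(String[] args) {
        // Roughly equivalent to Executors.newCachedThreadPool()
        ExecutorService pool = new ThreadPoolExecutor(
                0,                       // corePoolSize: no threads kept when idle
                Integer.MAX_VALUE,       // maximumPoolSize: grow as needed
                60L, TimeUnit.SECONDS,   // idle threads terminate after 60s
                new SynchronousQueue<Runnable>()); // zero-capacity handoff

        // Because the queue never buffers, offer() only succeeds if a worker
        // is already blocked in take()/poll(); otherwise the pool spawns a
        // new thread rather than queueing the task.
        pool.execute(() -> System.out.println("handed straight to a worker"));
        pool.shutdown();
    }
}
```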

Common Patterns:

  • Cached Thread Pool: Ideal for handing tasks straight to an available worker thread; a failed handoff signals the pool to spin up a new thread rather than buffer the task.
  • Direct Request Processing: Use it when the producer shouldn’t “buffer” work but should instead be forced to wait for a consumer to be ready.
// Zero-capacity handoff queue
BlockingQueue<Integer> handoff = new SynchronousQueue<>();
ExecutorService executor = Executors.newFixedThreadPool(2);

// Producer (blocks until someone takes)
executor.execute(() -> {
    try {
        handoff.put(200);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
});

// Consumer (blocks until someone puts)
executor.execute(() -> {
    try {
        Integer val = handoff.take();
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
});

Performance Trade-offs #

  • Pros: Extremely low latency for passing data. No intermediate memory buffer.
  • Cons: High contention on the head of the dual-stack/queue under very high thread counts. It offers no “smoothing” for bursty traffic.
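When bursty traffic is a concern, a common mitigation is to attempt the handoff with a bounded wait and shed or redirect the work on failure. A minimal sketch (the helper name `tryHandoff` is hypothetical):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.TimeUnit;

public class BoundedWaitProducer {
    // Hypothetical helper: attempt a direct handoff, but give up after a
    // short wait instead of blocking the producer indefinitely.
    static <T> boolean tryHandoff(BlockingQueue<T> queue, T item, long millis)
            throws InterruptedException {
        return queue.offer(item, millis, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        SynchronousQueue<Integer> handoff = new SynchronousQueue<>();

        // With no consumer ready, the timed offer fails once the window
        // expires, letting the caller drop the task or fall back to a
        // buffered queue instead of stalling.
        boolean delivered = tryHandoff(handoff, 42, 50);
        System.out.println("delivered: " + delivered); // false: nobody was waiting
    }
}
```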