Modern Java offers two fundamentally different approaches to handling concurrency: reactive programming with libraries such as RxJava, and virtual threads (Project Loom, standard since JDK 21).
At first glance, both aim to solve scalability and high-concurrency problems. However, they operate at entirely different abstraction levels and are optimized for different trade-offs.
This article provides a side-by-side comparison of the two models, runnable examples of each, and guidance on when to choose which.
Target audience: Beginners → Senior Engineers → Architects
| Aspect | RxJava (Reactive) | Virtual Threads |
|---|---|---|
| Model | Asynchronous, event-driven | Synchronous style |
| Thread usage | Few threads, many tasks | Many lightweight threads |
| State handling | Propagated through pipeline | Stack-based |
| Backpressure | Built-in | Not inherent |
| Debuggability | Harder | Easier |
| Cognitive load | Higher | Lower |
```
Client → Publisher → Operator → Operator → Subscriber
```

Execution is callback-driven and asynchronous.

```
Client → Virtual Thread → Blocking Code → Result
```

Execution appears sequential.
```java
import io.reactivex.rxjava3.core.Observable;
import io.reactivex.rxjava3.schedulers.Schedulers;

public class RxJavaExample {

    public static void main(String[] args) throws InterruptedException {
        Observable.fromCallable(() -> fetchUser())
                .subscribeOn(Schedulers.io())
                .map(user -> enrichUser(user))
                .observeOn(Schedulers.computation())
                .subscribe(
                        result -> System.out.println("Result: " + result),
                        error -> error.printStackTrace()
                );

        Thread.sleep(2000); // Keep JVM alive until the async pipeline finishes
    }

    static String fetchUser() {
        sleep(500); // Simulate blocking I/O
        return "User";
    }

    static String enrichUser(String user) {
        return user + " Enriched";
    }

    static void sleep(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException ignored) {}
    }
}
```
- `fromCallable()` wraps a blocking method.
- `subscribeOn(Schedulers.io())` schedules upstream execution on the IO thread pool.
- `map()` transforms the emitted value.
- `observeOn()` switches the downstream execution context.
- `subscribe()` triggers pipeline execution.

Important: nothing executes until `subscribe()` is called.

```
   subscribe()
        |
        v
+------------------+
|   IO Scheduler   |
|  (Thread Pool)   |
+------------------+
        |
   fetchUser()
        |
        v
      map()
        |
+------------------+
| Computation Pool |
+------------------+
        |
    Subscriber
```
```mermaid
flowchart LR
    A[Observable] --> B[subscribeOn IO]
    B --> C[map]
    C --> D[observeOn Computation]
    D --> E[Subscriber]
```
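The comparison table above notes that backpressure is built into RxJava (via `Flowable` and its bounded buffers between stages) but is not inherent to virtual threads. As a rough plain-JDK analogy of the idea, not RxJava's actual mechanism, a bounded `ArrayBlockingQueue` forces a fast producer to block whenever a slow consumer falls behind:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BackpressureSketch {

    // Drains 20 items through a 4-slot queue; returns the sum (1 + ... + 20 = 210).
    static int consumeAll() throws InterruptedException {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(4);

        Thread producer = new Thread(() -> {
            for (int i = 1; i <= 20; i++) {
                try {
                    queue.put(i); // blocks while the queue is full: backpressure
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        producer.start();

        int sum = 0;
        for (int i = 0; i < 20; i++) {
            Thread.sleep(5);     // slow consumer sets the pace
            sum += queue.take();
        }
        producer.join();
        return sum;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("Consumed sum: " + consumeAll()); // 210
    }
}
```

The producer never races ahead by more than the buffer size, which is the same guarantee a bounded reactive stream gives you for free.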
```java
import java.util.concurrent.Executors;

public class VirtualThreadExample {

    public static void main(String[] args) throws Exception {
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 5; i++) {
                executor.submit(() -> {
                    String user = fetchUser();
                    String enriched = enrichUser(user);
                    System.out.println(enriched);
                    return enriched; // returning a value makes the lambda a Callable,
                                     // which may throw the checked InterruptedException
                });
            }
        } // close() waits for submitted tasks to finish
    }

    static String fetchUser() throws InterruptedException {
        Thread.sleep(500); // Parks the virtual thread; the carrier thread is released
        return "User";
    }

    static String enrichUser(String user) {
        return user + " Enriched";
    }
}
```
Key points:

- Blocking calls (such as `Thread.sleep`) do not block OS threads.

Internally:
```
Virtual Thread → Park → Carrier Thread Released
```
This enables 100k+ concurrent blocking tasks.
```
1000 Virtual Threads
        |
        v
+------------------+
|  JVM Scheduler   |
+------------------+
        |
        v
8 Carrier Threads (OS)
        |
        v
    CPU Cores
```
```mermaid
flowchart TD
    A[Virtual Threads] --> B[JVM Scheduler]
    B --> C[Carrier Threads]
    C --> D[CPU Cores]
```
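To make the scaling claim concrete, here is a minimal sketch (assuming JDK 21+) that parks thousands of virtual threads at once; the JVM multiplexes them over a handful of carrier threads, so the whole batch finishes in roughly one sleep interval:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class ManyVirtualThreads {

    // Launches n virtual threads that all block at the same time.
    static int countSleepers(int n) throws InterruptedException {
        AtomicInteger done = new AtomicInteger();
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < n; i++) {
                executor.submit(() -> {
                    try {
                        Thread.sleep(100); // parks the virtual thread, freeing its carrier
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                    done.incrementAndGet();
                });
            }
        } // close() waits for all tasks
        return done.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(countSleepers(10_000)); // 10,000 concurrent blocking tasks
    }
}
```

Doing the same with platform threads would need 10,000 OS threads; here the OS-thread count stays near the number of CPU cores.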
```
Event → Stream → Transform → Transform → Terminal Operation
```

State is carried inside the pipeline.

```
Request → Thread → Blocking I/O → Return
```

State is maintained on the stack.
State is immutable and passed between operators:
```java
.map(user -> user + " enriched")
```
Each operator receives and emits a new state.
Advantages:

- No shared mutable state; each operator works only on the value it receives.
- Built-in backpressure and rich composition operators.

Disadvantages:

- Harder to debug; stack traces cross scheduler boundaries.
- Higher cognitive load for ordinary control flow (retries, conditionals, error handling).
State is stack-local:

```java
String user = fetchUser();
String enriched = enrichUser(user);
```
Advantages:

- Sequential, readable code with ordinary control flow and exceptions.
- Easier debugging: stack traces map directly to your source.

Bad for:

- Streaming pipelines that need backpressure or high-frequency event transformation.
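To illustrate the stack-local style, here is a sketch built around a hypothetical `flakyFetch()` call: the retry counter and the last error live as plain local variables, and blocking backoff is just a `sleep` (cheap when run on a virtual thread):

```java
public class RetrySketch {

    // Hypothetical flaky call: fails the first two times, then succeeds.
    static int attempts = 0;

    static String flakyFetch() {
        attempts++;
        if (attempts < 3) throw new RuntimeException("transient failure");
        return "User";
    }

    // Ordinary control flow: all retry state lives on this method's stack.
    static String fetchWithRetry(int maxRetries) throws InterruptedException {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxRetries; attempt++) {
            try {
                return flakyFetch();
            } catch (RuntimeException e) {
                last = e;
                Thread.sleep(10); // blocking backoff; on a virtual thread this only parks
            }
        }
        throw last;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(fetchWithRetry(5)); // succeeds on the third attempt
    }
}
```

Expressing the same retry-with-backoff in a reactive pipeline requires operators such as `retryWhen`, with the attempt count threaded through the stream instead of a local variable.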
Memory footprint: a platform thread reserves on the order of 1 MB of stack by default, while a virtual thread starts with only a few hundred bytes and grows its stack on demand.
Typical solution: assembly-time tracking, e.g. `Hooks.onOperatorDebug()` in Project Reactor (RxJava offers comparable assembly hooks via `RxJavaPlugins`).

But this adds overhead.
Trace example:

```java
try {
    service.call();
} catch (Exception e) {
    log.error("Failed", e);
}
```
No reactive context loss.
| Scenario | Recommended |
|---|---|
| REST API with blocking DB | Virtual Threads |
| Streaming data pipeline | RxJava |
| Legacy monolith modernization | Virtual Threads |
| High-frequency event processing | RxJava |
| Simple microservices | Virtual Threads |
They are not competitors at the same layer. Reactive programming is a data-flow paradigm; virtual threads are a thread-scheduling mechanism.

They can coexist: a reactive pipeline can run its blocking stages on a scheduler backed by virtual threads.
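One way to combine them, sketched below assuming RxJava 3 on the classpath and JDK 21+: back an RxJava `Scheduler` with a virtual-thread executor via `Schedulers.from(Executor)`, so blocking calls inside the pipeline park cheaply instead of occupying a pooled OS thread:

```java
import io.reactivex.rxjava3.core.Observable;
import io.reactivex.rxjava3.core.Scheduler;
import io.reactivex.rxjava3.schedulers.Schedulers;
import java.util.concurrent.Executors;

public class CoexistenceSketch {

    public static String run() {
        // Each subscription gets its own virtual thread instead of an io() pool thread.
        Scheduler vt = Schedulers.from(Executors.newVirtualThreadPerTaskExecutor());

        return Observable.fromCallable(() -> {
                    Thread.sleep(100);  // blocking call, fine on a virtual thread
                    return "User";
                })
                .subscribeOn(vt)
                .map(u -> u + " Enriched")
                .blockingFirst();
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```

You keep the reactive operators where they pay off (composition, backpressure) while letting virtual threads absorb the blocking work.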
If you are building:

- a typical REST service with blocking database calls → Default to Virtual Threads
- a streaming or event-driven data pipeline → Consider Reactive
- a legacy monolith you are modernizing, or simple microservices → Virtual Threads
- a high-frequency event-processing system → Reactive
Reactive programming optimizes for throughput efficiency. Virtual threads optimize for developer productivity and simplicity.
In 2026, the strategic shift is clear:
Prefer structured concurrency and virtual threads unless you truly need reactive streaming semantics.
Both models remain powerful — but they solve different layers of the concurrency problem.