Reactive Programming with RxJava for Efficient Data Access – Couchbase Connect 2014

Reactive Programming with RxJava for Efficient Data Access Ben Christensen | Software Engineer, Netflix Michael Nitschinger | Software Engineer, Couchbase

Transcript of Reactive Programming with RxJava for Efficient Data Access – Couchbase Connect 2014



RxJava: Reactive Extensions for Async Programming


RxJava
http://github.com/ReactiveX/RxJava
http://reactivex.io

           Single                 Multiple
Sync       T getData()            Iterable getData()
Async      Future getData()       Observable getData()

RxJava is a port of Microsoft's Rx (Reactive Extensions) to Java that attempts to be polyglot by targeting the JVM rather than just Java the language.


Observable.create(subscriber -> {
    subscriber.onNext("Hello world!");
    subscriber.onCompleted();
}).forEach(System.out::println);

Synchronous, single value


Observable.create(subscriber -> {
    subscriber.onNext("Hello");
    subscriber.onNext("world");
    subscriber.onNext("!");
    subscriber.onCompleted();
}).forEach(System.out::println);

Synchronous, multi-value


Observable.create(subscriber -> {
    try {
        subscriber.onNext(doSomething());
        subscriber.onCompleted();
    } catch (Throwable e) {
        subscriber.onError(e);
    }
}).subscribeOn(Schedulers.io())
  .forEach(System.out::println);

Asynchronous, single value, with error handling


Observable.create(subscriber -> {
    int i = 0;
    while (!subscriber.isUnsubscribed()) {
        subscriber.onNext(i++);
    }
}).take(10).forEach(System.out::println);

Synchronous, multi-value, with unsubscribe


Observable.create(subscriber -> {
    AtomicInteger i = new AtomicInteger();
    AtomicLong requested = new AtomicLong();
    subscriber.setProducer(r -> {
        if (requested.getAndAdd(r) == 0) {
            do {
                if (subscriber.isUnsubscribed()) {
                    break;
                }
                subscriber.onNext(i.incrementAndGet());
            } while (requested.decrementAndGet() > 0);
        }
    });
}).observeOn(Schedulers.newThread())
  .take(10).forEach(System.out::println);

Synchronous multi-valued source that thread-hops, with reactive pull backpressure


Observable.from(iterable)
    .observeOn(Schedulers.newThread())
    .take(10).forEach(System.out::println);

Synchronous multi-valued source that thread-hops, with reactive pull backpressure


Netflix is a subscription service for movies and TV shows for $7.99 USD/month (about the same converted price in each country's local currency).

More information about the re-architecture can be found at http://techblog.netflix.com/2013/01/optimizing-netflix-api.html

Within a single request we now must achieve at least the same level of concurrency as was previously achieved by the parallel network requests, and preferably better, since we can leverage the power of server hardware and lower-latency network communication and eliminate redundant calls performed per incoming request.

Abstract Concurrency

public Observable handle(HttpServerRequest request, HttpServerResponse response) {
    // first request the User object
    return new UserCommand(request.getQueryParameters().get("userId")).observe().flatMap(user -> {
        // then fetch the personal catalog
        Observable catalog = new PersonalizedCatalogCommand(user).observe()
            .flatMap(catalogList -> {
                return catalogList.videos().flatMap(video -> {
                    Observable bookmark = new BookmarkCommand(video).observe();
                    Observable rating = new RatingsCommand(video).observe();
                    Observable metadata = new VideoMetadataCommand(video).observe();
                    return Observable.zip(bookmark, rating, metadata, (b, r, m) -> {
                        return combineVideoData(video, b, r, m);
                    });
                });
            });
        // and fetch social data in parallel
        Observable social = new SocialCommand(user).observe().map(s -> {
            return s.getDataAsMap();
        });
        // merge the results
        return Observable.merge(catalog, social);
    }).flatMap(data -> {
        // output as SSE as we get back the data (no waiting until all is done)
        return response.writeAndFlush(new ServerSentEvent(SimpleJson.mapToJson(data)));
    });
}


flatMap

Observable b = Observable.flatMap({ T t ->
    Observable r = ... transform t ...
    return r;
})

A single flattened Observable is returned instead of an Observable of Observables.
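To make the operator concrete outside of the Netflix example, here is a minimal runnable sketch (not from the slides) using the RxJava 1.x API:

import rx.Observable;

public class FlatMapExample {
    public static void main(String[] args) {
        Observable.just(1, 2, 3)
            // each value is mapped to its own Observable ...
            .flatMap(i -> Observable.just(i, i * 10))
            // ... and the inner Observables are flattened into a single stream
            .forEach(System.out::println); // prints 1, 10, 2, 20, 3, 30
    }
}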


Observable.zip(a, b, { a, b ->
    ... operate on values from both a & b ...
    return [a, b]; // i.e. return a tuple
})
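A minimal runnable zip sketch (not from the slides), again using RxJava 1.x:

import rx.Observable;

public class ZipExample {
    public static void main(String[] args) {
        Observable<String> a = Observable.just("bookmark", "rating", "metadata");
        Observable<Integer> b = Observable.just(1, 2, 3);

        // zip pairs the n-th value of each source and applies the combining function
        Observable.zip(a, b, (x, y) -> x + ":" + y)
            .forEach(System.out::println); // prints bookmark:1, rating:2, metadata:3
    }
}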


The merge operator is used to combine multiple Observable sequences of the same type into a single Observable sequence containing all of the data.

The X represents an onError call that would terminate the sequence, so once it occurs the merged Observable also ends. The mergeDelayError operator allows delaying the error until after all other values have been successfully merged.
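A small sketch of the difference (not from the slides), assuming RxJava 1.x:

import rx.Observable;

public class MergeExample {
    public static void main(String[] args) {
        Observable<String> healthy = Observable.just("a", "b");
        Observable<String> broken = Observable.error(new RuntimeException("boom"));

        // merge: the error terminates the combined stream as soon as it occurs
        Observable.merge(broken, healthy)
            .subscribe(System.out::println, e -> System.out.println("failed: " + e.getMessage()));

        // mergeDelayError: values from the healthy source are still delivered,
        // and the error is reported only after the other sources complete
        Observable.mergeDelayError(broken, healthy)
            .subscribe(System.out::println, e -> System.out.println("failed: " + e.getMessage()));
    }
}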


Instead of a blocking API ...

class VideoService {
    def VideoList getPersonalizedListOfMovies(userId);
    def VideoBookmark getBookmark(userId, videoId);
    def VideoRating getRating(userId, videoId);
    def VideoMetadata getMetadata(videoId);
}

... create an Observable API:

class VideoService {
    def Observable getPersonalizedListOfMovies(userId);
    def Observable getBookmark(userId, videoId);
    def Observable getRating(userId, videoId);
    def Observable getMetadata(videoId);
}

With Rx, blocking APIs could be converted into Observable APIs, accomplishing our architecture goals, including abstracting away the control and implementation of concurrency and asynchronous execution.
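One common way to perform that conversion (a sketch with assumed names, not the slide code) is to defer the blocking call and move it onto a scheduler:

import rx.Observable;
import rx.schedulers.Schedulers;

class VideoMetadata { }

class VideoMetadataAdapter {
    // hypothetical legacy blocking call
    VideoMetadata blockingGetMetadata(long videoId) {
        return new VideoMetadata();
    }

    // Observable wrapper: defer postpones the blocking call until subscription,
    // and subscribeOn moves the work off the caller's thread onto an IO scheduler
    Observable<VideoMetadata> getMetadata(long videoId) {
        return Observable
            .defer(() -> Observable.just(blockingGetMetadata(videoId)))
            .subscribeOn(Schedulers.io());
    }
}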


Non-Opinionated Concurrency

One of the other positives of the Rx Observable is that it is abstracted from the source of concurrency. It is not opinionated and allows the implementation to decide. For example, an Observable API could just use the calling thread to synchronously execute and respond.


Or it could use a thread-pool to do the work asynchronously and call back from that thread.

Or it could use multiple threads, each thread calling back via onNext(T) when the value is ready.

Or it could use an actor pattern instead of a thread-pool.

Or NIO with an event-loop.


Or a thread-pool/actor that does the work but then performs the callback via an event-loop, so the thread-pool/actor is tuned for IO and the event-loop for CPU.

All of these different implementation choices are possible without changing the signature of the method and without the calling code changing their behavior or how they interact with or compose responses.
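A rough illustration of that point (class and method names here are hypothetical): the Observable-returning signature stays the same while the implementation behind it changes.

import rx.Observable;
import rx.schedulers.Schedulers;

class Bookmark { }

class BookmarkService {
    // Callers only see Observable<Bookmark>; how the value is produced is an
    // implementation detail that can change without breaking them.
    Observable<Bookmark> getBookmark(long videoId) {
        // (a) synchronous, on the calling thread:
        //     return Observable.just(loadBlocking(videoId));
        // (b) asynchronous, on an IO thread pool:
        return Observable
            .defer(() -> Observable.just(loadBlocking(videoId)))
            .subscribeOn(Schedulers.io());
        // (c) an NIO/event-loop client could likewise emit via onNext from its
        //     own threads, still without changing this signature.
    }

    private Bookmark loadBlocking(long videoId) {
        return new Bookmark();
    }
}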


Decouples Consumption from Production


class VideoService {
    def Observable getPersonalizedListOfMovies(userId);
    def Observable getBookmark(userId, videoId);
    def Observable getRating(userId, videoId);
    def Observable getMetadata(videoId);
}

Clear API Communicates Potential Cost

Implementation Can Differ

BIO Network Call

Local Cache

Collapsed Network Call


Implementation Can Differ and Change: BIO Network Call, NIO Network Call, Collapsed Network Call, Local Cache

Retrieval, Transformation, Combination: all done in the same declarative manner

Couchbase Java Client 2.0
Reactive from top to bottom

Java SDK 2.0

It is a complete rewrite compared to 1.* and provides asynchronous & synchronous document-oriented APIs.

Couchbase Core IO

Common infrastructure & feature set for all language bindings
Message oriented
Asynchronous only
Low overhead and performance focused
Supports Java 6+ (including 8!)

Disruptor RingBuffer for implicit batching and backpressure
Netty for high-performance IO

The Big Picture

Connecting
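The connection code on this slide did not survive the transcript; a minimal sketch of connecting with the 2.0 SDK (assuming the standard com.couchbase.client.java API, a local node, and the "default" bucket) looks roughly like this:

import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.Cluster;
import com.couchbase.client.java.CouchbaseCluster;

public class ConnectExample {
    public static void main(String[] args) {
        // bootstrap against one or more seed nodes, then open a bucket
        Cluster cluster = CouchbaseCluster.create("127.0.0.1");
        Bucket bucket = cluster.openBucket("default");

        // ... use the bucket ...

        cluster.disconnect();
    }
}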

From Sync to Async
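The slide content is missing from the transcript; the essential idea is that the blocking API is a wrapper around the asynchronous one, and async() exposes the Observable-based variant (sketch, same assumptions as above):

import com.couchbase.client.java.AsyncBucket;
import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.CouchbaseCluster;

public class SyncToAsyncExample {
    public static void main(String[] args) {
        Bucket bucket = CouchbaseCluster.create("127.0.0.1").openBucket("default");

        // the synchronous Bucket delegates to the async core;
        // async() hands back the Observable-based AsyncBucket
        AsyncBucket async = bucket.async();
        System.out.println("Async API for bucket: " + async.name());
    }
}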

Connecting Async
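A sketch of the fully asynchronous bootstrap (assuming CouchbaseAsyncCluster from the 2.0 SDK); openBucket returns an Observable, so nothing happens until it is subscribed:

import com.couchbase.client.java.CouchbaseAsyncCluster;

public class ConnectAsyncExample {
    public static void main(String[] args) throws InterruptedException {
        CouchbaseAsyncCluster cluster = CouchbaseAsyncCluster.create("127.0.0.1");

        cluster.openBucket("default")
            .subscribe(
                bucket -> System.out.println("Bucket open: " + bucket.name()),
                err -> System.err.println("Could not open bucket: " + err)
            );

        Thread.sleep(2000); // keep the demo JVM alive while the connection is established
    }
}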

Storing a Document
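A sketch of storing a JSON document (document ID and field names are made up), showing both the blocking call and its Observable counterpart:

import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.CouchbaseCluster;
import com.couchbase.client.java.document.JsonDocument;
import com.couchbase.client.java.document.json.JsonObject;

public class StoreExample {
    public static void main(String[] args) {
        Bucket bucket = CouchbaseCluster.create("127.0.0.1").openBucket("default");

        JsonObject content = JsonObject.create()
            .put("firstname", "Michael")
            .put("job", "developer");
        JsonDocument doc = JsonDocument.create("user::michael", content);

        // blocking variant
        bucket.upsert(doc);

        // asynchronous variant: an Observable of the stored document
        bucket.async().upsert(doc)
            .subscribe(stored -> System.out.println("Stored " + stored.id()));
    }
}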

Loading a Document
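A sketch of loading the same document; with the async API a missing document yields an empty Observable, so a fallback composes with ordinary Rx operators:

import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.CouchbaseCluster;
import com.couchbase.client.java.document.JsonDocument;

public class LoadExample {
    public static void main(String[] args) {
        Bucket bucket = CouchbaseCluster.create("127.0.0.1").openBucket("default");

        // blocking variant: returns null if the document does not exist
        JsonDocument doc = bucket.get("user::michael");
        System.out.println("Blocking load: " + doc);

        // async variant: empty Observable if missing, so defaultIfEmpty supplies a fallback
        bucket.async().get("user::michael")
            .map(d -> d.content().getString("firstname"))
            .defaultIfEmpty("unknown")
            .subscribe(System.out::println);
    }
}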

Querying

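The query examples are likewise missing from the transcript; a sketch of an asynchronous view query (design document and view names are made up), where rows stream in and each row's document is fetched and transformed with Rx operators:

import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.CouchbaseCluster;
import com.couchbase.client.java.view.ViewQuery;

public class QueryExample {
    public static void main(String[] args) {
        Bucket bucket = CouchbaseCluster.create("127.0.0.1").openBucket("default");

        bucket.async()
            .query(ViewQuery.from("users", "by_name").limit(10))
            .flatMap(result -> result.rows())      // the rows arrive as an Observable
            .flatMap(row -> row.document())        // load the full document for each row
            .map(d -> d.content().getString("firstname"))
            .subscribe(System.out::println);
    }
}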

Stability Patterns

Reacting to Failure

Resiliency is key
Things will go wrong, so better plan for it
Do not aim for QA, aim for production

Do not trust integration points
Treat the database (SDK) as an integration point
"The more you sweat in peace, the less you bleed in war."

Useful Reading

Release It! by Michael T. Nygard

Stability Patterns & Antipatterns
Capacity planning
Operations

Timeouts

The network is unreliable
Servers fail
The SDK contains bugs

Always specify timeouts and deal with them! The synchronous wrapper always defines them for you.

Timeouts: Simple
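The timeout code on this slide is not in the transcript; on the async API a timeout is just another Rx operator (sketch, same SDK assumptions as above):

import java.util.concurrent.TimeUnit;
import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.CouchbaseCluster;

public class SimpleTimeoutExample {
    public static void main(String[] args) {
        Bucket bucket = CouchbaseCluster.create("127.0.0.1").openBucket("default");

        bucket.async()
            .get("user::michael")
            .timeout(1, TimeUnit.SECONDS) // onError(TimeoutException) if no result in time
            .subscribe(
                doc -> System.out.println(doc.content()),
                err -> System.err.println("Gave up: " + err)
            );
    }
}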

Timeouts: Synchronous API
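For the blocking API, the 2.0 SDK applies a default timeout to every call and, to my understanding, also accepts a per-call override (sketch, assuming the standard timeout overloads exist as shown):

import java.util.concurrent.TimeUnit;
import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.CouchbaseCluster;
import com.couchbase.client.java.document.JsonDocument;

public class SyncTimeoutExample {
    public static void main(String[] args) {
        Bucket bucket = CouchbaseCluster.create("127.0.0.1").openBucket("default");

        // per-call timeout override; a runtime exception is thrown if the
        // server does not answer in time, so it must be handled by the caller
        JsonDocument doc = bucket.get("user::michael", 1, TimeUnit.SECONDS);
        System.out.println(doc);
    }
}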

Timeouts: Complex Example

Coordinated Retry

Fail fast

Don't let your system get stuck
Fail and apply backpressure

Retry

Retrying immediately won't help
Either linear or exponential backoff is necessary
Have a strategy if retrying also doesn't work (Fallbacks!)

Coordinated Retry: Fallback
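A sketch of a fallback (not the slide code): if the primary load fails or times out, onErrorResumeNext switches to an alternative source, for example a replica read or, as here, a static default document.

import java.util.concurrent.TimeUnit;
import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.CouchbaseCluster;
import com.couchbase.client.java.document.JsonDocument;
import rx.Observable;

public class FallbackExample {
    public static void main(String[] args) {
        Bucket bucket = CouchbaseCluster.create("127.0.0.1").openBucket("default");

        bucket.async()
            .get("user::michael")
            .timeout(500, TimeUnit.MILLISECONDS)
            // on failure, fall back to a default document instead of propagating the error
            .onErrorResumeNext(err -> Observable.just(JsonDocument.create("user::michael")))
            .subscribe(doc -> System.out.println("Loaded " + doc.id()));
    }
}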

Coordinated Retry with Delay
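A sketch of retryWhen with exponential backoff (not the slide code): each failure is paired with an attempt counter, retries are delayed 2, 4, and 8 seconds, and the original error is re-raised once the attempts are exhausted.

import java.util.concurrent.TimeUnit;
import rx.Observable;

public class RetryWithDelayExample {
    public static void main(String[] args) throws InterruptedException {
        unreliableLoad()
            .retryWhen(errors -> errors
                .zipWith(Observable.range(1, 4), (err, attempt) ->
                    attempt < 4
                        ? Observable.timer(1L << attempt, TimeUnit.SECONDS) // 2s, 4s, 8s
                        : Observable.<Long>error(err))                      // then give up
                .flatMap(o -> o))
            .subscribe(
                v -> System.out.println("Got: " + v),
                e -> System.out.println("Giving up: " + e.getMessage()));

        Thread.sleep(20000); // keep the demo JVM alive for the delayed retries
    }

    static Observable<String> unreliableLoad() {
        return Observable.error(new RuntimeException("temporarily unavailable"));
    }
}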

More Patterns

Circuit Breaker

Closed if everything is okay
Opens once the integration point breaks
Fail fast if open
Recheck with half-open

Bulkheads

Isolate failure where possible
Don't let your ship sink!
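Netflix's Hystrix library packages both patterns (a circuit breaker plus thread-pool bulkheads) behind command objects in the style of the UserCommand and BookmarkCommand seen earlier; a rough sketch with made-up names:

import com.netflix.hystrix.HystrixCommand;
import com.netflix.hystrix.HystrixCommandGroupKey;

public class GetBookmarkCommand extends HystrixCommand<String> {

    private final String videoId;

    public GetBookmarkCommand(String videoId) {
        // the group key also scopes the bulkhead: each group gets its own thread pool
        super(HystrixCommandGroupKey.Factory.asKey("BookmarkService"));
        this.videoId = videoId;
    }

    @Override
    protected String run() throws Exception {
        // the real integration-point call; failures and timeouts trip the circuit breaker
        return loadBookmarkOverNetwork(videoId);
    }

    @Override
    protected String getFallback() {
        // served when the call fails, times out, or the circuit is open
        return "0";
    }

    private String loadBookmarkOverNetwork(String videoId) throws Exception {
        return "42";
    }
}

Calling new GetBookmarkCommand("video::123").observe() then returns an Observable that composes with everything shown above.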

Q&A

Thanks!