spark-issues mailing list archives

From "Shixiong Zhu (JIRA)" <>
Subject [jira] [Commented] (SPARK-5124) Standardize internal RPC interface
Date Fri, 06 Mar 2015 02:20:38 GMT


Shixiong Zhu commented on SPARK-5124:

1. For the local Endpoint, let's put it in a future change.
2. RpcCallContext looks good to me. @rxin, thoughts? 
3. For `replyWithSender`, I've copied my answer from the PR here:

For example, RpcEndpoint A calls sendWithReply to send a message to RpcEndpoint B. In B's
receiveAndReply, there are two possible requirements:

1. Reply to A without expecting a reply back. In this case, the reply should be delivered to
A's receive method. (reply)
2. Reply to A and expect a reply back in turn. In this case, the reply should be delivered to
A's receiveAndReply method. (replyWithSender)

That's why we need two methods here.
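The routing difference can be sketched as below. This is a minimal, hypothetical stand-in for the interfaces under discussion, not Spark's actual API: `Endpoint`, the `RpcCallContext` constructor arguments, `sendWithReply`, and the `"ping"`/`"handshake"` messages are all illustrative assumptions.

```scala
import scala.collection.mutable.ListBuffer

// Hypothetical sketch: simplified stand-ins, not Spark's actual RPC classes.
class Endpoint(val name: String) {
  val received         = ListBuffer[String]() // messages delivered to `receive`
  val receivedForReply = ListBuffer[String]() // messages delivered to `receiveAndReply`

  // One-way messages (no reply expected) land here.
  def receive(msg: String): Unit = received += msg

  // Messages that expect a reply land here, with a context for answering.
  def receiveAndReply(msg: String, ctx: RpcCallContext): Unit = {
    receivedForReply += msg
    if (msg == "ping") ctx.reply("pong")               // answer; no reply needed back
    if (msg == "handshake") ctx.replyWithSender("ack") // answer; expect a reply back
  }
}

class RpcCallContext(sender: Endpoint, self: Endpoint) {
  // reply: the answer goes to the original sender's `receive`.
  def reply(msg: String): Unit = sender.receive(msg)

  // replyWithSender: the answer goes to the sender's `receiveAndReply`,
  // carrying a fresh context so the sender can answer back in turn.
  def replyWithSender(msg: String): Unit =
    sender.receiveAndReply(msg, new RpcCallContext(self, sender))
}

// sendWithReply: deliver to the target's `receiveAndReply` with a context
// pointing back at the caller.
def sendWithReply(from: Endpoint, to: Endpoint, msg: String): Unit =
  to.receiveAndReply(msg, new RpcCallContext(from, to))

val a = new Endpoint("A")
val b = new Endpoint("B")
sendWithReply(a, b, "ping")      // B answers via reply
sendWithReply(a, b, "handshake") // B answers via replyWithSender
```

With reply, B's answer lands in A's receive; with replyWithSender, it lands in A's receiveAndReply along with a context, so A can answer back in turn.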

> Standardize internal RPC interface
> ----------------------------------
>                 Key: SPARK-5124
>                 URL:
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>            Reporter: Reynold Xin
>            Assignee: Shixiong Zhu
>         Attachments: Pluggable RPC - draft 1.pdf, Pluggable RPC - draft 2.pdf
> In Spark we use Akka as the RPC layer. It would be great if we can standardize the internal
> RPC interface to facilitate testing. This will also provide the foundation to try other RPC
> implementations in the future.

This message was sent by Atlassian JIRA

