From 9123089b770e5f7c647583ec83c09ca0d0589a19 Mon Sep 17 00:00:00 2001
From: Ganesan Ramalingam
Date: Mon, 18 Nov 2024 15:45:40 -0800
Subject: [PATCH] Fix spacing

---
 docs/proposals/ShardingFormalism.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/proposals/ShardingFormalism.md b/docs/proposals/ShardingFormalism.md
index fb3266cad54..290fd4b7d1a 100644
--- a/docs/proposals/ShardingFormalism.md
+++ b/docs/proposals/ShardingFormalism.md
@@ -101,6 +101,7 @@
 In the special case where all corresponding input axes have a size of 1, the out
 the same sharding (that is, replicated across all devices of the node op).
 **Composing Sharding Specs on Different Axes**
+
 Consider the example of an `Add (Input1, Input2)` op. Consider the case where
 `Input1` has shape `[M, 1]` and `Input2` has shape `[1, N]`. The output has shape
 `[M, N]`, as a result of broadcasting.
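
The broadcasting behavior that the patched paragraph describes, an `Add` of shapes `[M, 1]` and `[1, N]` producing `[M, N]`, can be sketched in plain Python. This is illustrative only and not part of the patch; the function name `broadcast_add` is an invented helper.

```python
# Illustrative sketch of the broadcasting rule described in the patched text:
# adding an input of shape [M, 1] to an input of shape [1, N] yields [M, N].
def broadcast_add(col, row):
    """Add a column vector (shape [M, 1]) to a row vector (shape [1, N]).

    Each axis of size 1 is conceptually stretched to match the other
    input's axis, so the result has shape [M, N].
    """
    M, N = len(col), len(row[0])
    return [[col[i][0] + row[0][j] for j in range(N)] for i in range(M)]

# Shapes [3, 1] and [1, 2] broadcast to [3, 2].
result = broadcast_add([[1], [2], [3]], [[10, 20]])
print(result)  # [[11, 21], [12, 22], [13, 23]]
```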