[fix] remove shardconfig
duanjunwen committed Dec 18, 2024
1 parent 6bbe666 commit 3946366
Showing 2 changed files with 0 additions and 4 deletions.
docs/source/en/features/shardformer.md (2 changes: 0 additions & 2 deletions)
@@ -213,8 +213,6 @@ The support matrix will grow larger as more models and optimization tools emerge

 The configuration of Shardformer is controlled by class `ShardConfig`:
 
-{{ autodoc:colossalai.shardformer.shard.shard_config }}
-
 If you want to enable Apex Fused Layernorm, please install `apex`.
 If you want to enable the usage of flash attention, please install `flash_attn`.
 In addition, xFormers's `cutlass_op` can serve as a backup for flash attention.
docs/source/zh-Hans/features/shardformer.md (2 changes: 0 additions & 2 deletions)
@@ -209,8 +209,6 @@ Author: [Baizhou Zhang](https://github.com/Fridge003), [Bin Jia](https://github.

 The configuration of Shardformer is controlled by the parameters of the class `ShardConfig`:
 
-{{ autodoc:colossalai.shardformer.shard.shard_config }}
-
 If you want to enable Apex Fused Layernorm, please install `apex`. If you want to enable flash attention, please install `flash_attn`. In addition, xFormers' `cutlass_op` can serve as a supplementary optimization for flash attention.
 
 ### Launching Shardformer
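Note: the `{{ autodoc: ... }}` directive deleted in both files previously pulled the `ShardConfig` API reference into the rendered docs. For context, here is a minimal sketch of how `ShardConfig` is typically constructed and handed to `ShardFormer`; the import path and the `optimize` call follow the `colossalai.shardformer` examples, but treat the exact field names (`tensor_parallel_process_group`, `enable_fused_normalization`, `enable_flash_attention`) as assumptions to verify against your installed version.

```python
# Sketch only, assuming the public colossalai.shardformer API; field names are
# taken from the options mentioned in the docs above and may differ by version.
import torch.distributed as dist

from colossalai.shardformer import ShardConfig, ShardFormer
from transformers import BertForSequenceClassification


def shard_bert(tp_group: dist.ProcessGroup):
    model = BertForSequenceClassification.from_pretrained("bert-base-uncased")

    shard_config = ShardConfig(
        tensor_parallel_process_group=tp_group,  # process group used for tensor parallelism
        enable_tensor_parallelism=True,
        enable_fused_normalization=True,  # requires `apex` (Apex Fused Layernorm)
        enable_flash_attention=True,      # requires `flash_attn`; xFormers can serve as backup
    )
    shard_former = ShardFormer(shard_config=shard_config)
    # optimize() returns the sharded model plus any parameters shared across ranks
    sharded_model, shared_params = shard_former.optimize(model)
    return sharded_model
```

Since this commit touches only the two markdown files, the class and its defaults are unchanged; only the rendered autodoc block disappears from the documentation pages.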
