jacobfulano committed · Commit cef93d0 · 1 Parent(s): 1a9175e

Update README.md

Files changed (1): README.md +17 -7
README.md CHANGED
@@ -11,20 +11,31 @@ pinned: false
 
 MosaicML’s mission is to make efficient training of ML models accessible.
 We continually productionize state-of-the-art research on efficient model training, and study the
- combinations of these methods in order to ensure that model training is ✨ as efficient as possible ✨.
+ combinations of these methods in order to ensure that model training is ✨ as optimized as possible ✨.
 These findings are baked into our highly efficient model training stack, the MosaicML platform.
 
 If you have questions, please feel free to reach out to us on [Twitter](https://twitter.com/mosaicml),
 [Email]([email protected]), or join our [Slack channel](https://join.slack.com/t/mosaicml-community/shared_invite/zt-w0tiddn9-WGTlRpfjcO9J5jyrMub1dg)!
 
 
+ # [Composer Library](https://github.com/mosaicml/composer)
 
- # [Composer Library]()
-
- The open source Composer library makes it easy to train models faster at the algorithmic level.
+ The open source Composer library makes it easy to train models faster at the algorithmic level. It is built on top of PyTorch.
 Use our collection of speedup methods in your own training loop or—for the best experience—with our Composer trainer.
 
- # [StreamingDataset]()
+ # [MosaicML Examples Repo](https://github.com/mosaicml/examples)
+
+ This repo contains reference examples for training ML models quickly and to high accuracy. It's designed to be easily forked and modified.
+
+ It currently features the following examples:
+
+ * [ResNet-50 + ImageNet](https://github.com/mosaicml/examples#resnet-50--imagenet)
+ * [DeeplabV3 + ADE20k](https://github.com/mosaicml/examples#deeplabv3--ade20k)
+ * [GPT / Large Language Models](https://github.com/mosaicml/examples#large-language-models-llms)
+ * [BERT](https://github.com/mosaicml/examples#bert)
+
+
+ # [StreamingDataset](https://github.com/mosaicml/streaming)
 
 Fast, accurate streaming of training data from cloud storage. We built StreamingDataset to make training on large datasets from cloud storage as fast, cheap, and scalable as possible.
 
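The Composer section in the hunk above mentions using speedup methods in your own training loop or with the Composer trainer. Below is a minimal sketch of the trainer path. It assumes `composer` and `torchvision` are installed; the MNIST model, the two algorithms chosen, and all hyperparameters are illustrative assumptions, not part of this commit:

```python
# Minimal sketch: Composer's Trainer with two algorithmic speedup methods.
# Model, data, and hyperparameters are illustrative assumptions.
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

from composer import Trainer
from composer.algorithms import ChannelsLast, LabelSmoothing
from composer.models import mnist_model  # small ready-made ComposerModel

# A plain PyTorch dataloader; the Trainer consumes it directly.
train_dataloader = DataLoader(
    datasets.MNIST("data", train=True, download=True, transform=transforms.ToTensor()),
    batch_size=128,
)

trainer = Trainer(
    model=mnist_model(),
    train_dataloader=train_dataloader,
    max_duration="1ep",  # one epoch, for illustration
    algorithms=[ChannelsLast(), LabelSmoothing(smoothing=0.1)],  # speedup methods
)
trainer.fit()
```

For the "in your own training loop" path, the same methods are also exposed as standalone functions in Composer's `composer.functional` module.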
@@ -36,6 +47,5 @@ With support for major cloud storage providers (AWS, OCI, and GCS are supported
 and designed as a drop-in replacement for your PyTorch IterableDataset class, StreamingDataset seamlessly integrates
 into your existing training workflows.
 
- # MosaicML Platform
-
+ # [MosaicML Platform for Multinode Orchestration](https://mcli.docs.mosaicml.com/en/latest/getting_started/installation.html)
 
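The diff above describes StreamingDataset as a drop-in replacement for a PyTorch IterableDataset that streams from cloud storage. Here is a minimal sketch of that usage; the bucket URI, local cache path, and batch size are placeholder assumptions:

```python
# Minimal sketch: StreamingDataset as a drop-in IterableDataset.
# The remote bucket, cache directory, and batch size are placeholders.
from torch.utils.data import DataLoader
from streaming import StreamingDataset

dataset = StreamingDataset(
    remote="s3://my-bucket/my-dataset",  # hypothetical shard location
    local="/tmp/my-dataset-cache",       # local cache for downloaded shards
    shuffle=True,
    batch_size=32,
)
loader = DataLoader(dataset, batch_size=32)

for batch in loader:
    ...  # existing training step, unchanged
```

This assumes the dataset was first converted to the library's MDS shard format (e.g. with `streaming.MDSWriter`) and uploaded to the remote path.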
 