Commit
Merge pull request #403 from AZ-AI/docs_os
Docs os
RichJackson authored Dec 9, 2022
2 parents 58b5305 + 34e7c96 commit b8c6e64
Showing 31 changed files with 318 additions and 506 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -67,7 +67,7 @@ with initialize_config_dir(config_dir=str(cdir)):

# Documentation

-[Find our docs here](https://psychic-chainsaw-f197cc2b.pages.github.io/_build/html/index.html)
+[Find our docs here](https://astrazeneca.github.io/KAZU/_build/html/index.html)

## License

@@ -1,4 +1,4 @@
-kazu.modelling.ontology\_preprocessing.base
+kazu.modelling.ontology\_preprocessing.base
===========================================

.. automodule:: kazu.modelling.ontology_preprocessing.base
@@ -19,6 +19,7 @@ kazu.modelling.ontology\_preprocessing.base

BiologicalProcessGeneOntologyParser
CLOOntologyParser
+CLOntologyParser
CellosaurusOntologyParser
CellularComponentGeneOntologyParser
ChemblOntologyParser
@@ -444,7 +444,7 @@
<dd><p>Implement one or more PyTorch DataLoaders for training.</p>
<dl class="simple">
<dt>Return:</dt><dd><p>A collection of <a class="reference external" href="https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader" title="(in PyTorch v1.13)"><code class="xref py py-class docutils literal notranslate"><span class="pre">torch.utils.data.DataLoader</span></code></a> specifying training samples.
-In the case of multiple dataloaders, please see this <a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/guides/data.html#multiple-dataloaders" title="(in PyTorch Lightning v1.8.3.post1)"><span class="xref std std-ref">section</span></a>.</p>
+In the case of multiple dataloaders, please see this <a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/guides/data.html#multiple-dataloaders" title="(in PyTorch Lightning v1.8.4)"><span class="xref std std-ref">section</span></a>.</p>
</dd>
</dl>
<p>The dataloader you return will not be reloaded unless you set
@@ -463,7 +463,7 @@
<p>do not assign state in prepare_data</p>
</div>
<ul class="simple">
-<li><p><a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#pytorch_lightning.trainer.trainer.Trainer.fit" title="(in PyTorch Lightning v1.8.3.post1)"><code class="xref py py-meth docutils literal notranslate"><span class="pre">fit()</span></code></a></p></li>
+<li><p><a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#pytorch_lightning.trainer.trainer.Trainer.fit" title="(in PyTorch Lightning v1.8.4)"><code class="xref py py-meth docutils literal notranslate"><span class="pre">fit()</span></code></a></p></li>
<li><p><code class="xref py py-meth docutils literal notranslate"><span class="pre">prepare_data()</span></code></p></li>
<li><p><code class="xref py py-meth docutils literal notranslate"><span class="pre">setup()</span></code></p></li>
</ul>
@@ -529,8 +529,8 @@
a positive integer.</p>
<p>It’s recommended that all data downloads and preparation happen in <code class="xref py py-meth docutils literal notranslate"><span class="pre">prepare_data()</span></code>.</p>
<ul class="simple">
-<li><p><a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#pytorch_lightning.trainer.trainer.Trainer.fit" title="(in PyTorch Lightning v1.8.3.post1)"><code class="xref py py-meth docutils literal notranslate"><span class="pre">fit()</span></code></a></p></li>
-<li><p><a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#pytorch_lightning.trainer.trainer.Trainer.validate" title="(in PyTorch Lightning v1.8.3.post1)"><code class="xref py py-meth docutils literal notranslate"><span class="pre">validate()</span></code></a></p></li>
+<li><p><a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#pytorch_lightning.trainer.trainer.Trainer.fit" title="(in PyTorch Lightning v1.8.4)"><code class="xref py py-meth docutils literal notranslate"><span class="pre">fit()</span></code></a></p></li>
+<li><p><a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html#pytorch_lightning.trainer.trainer.Trainer.validate" title="(in PyTorch Lightning v1.8.4)"><code class="xref py py-meth docutils literal notranslate"><span class="pre">validate()</span></code></a></p></li>
<li><p><code class="xref py py-meth docutils literal notranslate"><span class="pre">prepare_data()</span></code></p></li>
<li><p><code class="xref py py-meth docutils literal notranslate"><span class="pre">setup()</span></code></p></li>
</ul>
@@ -1108,7 +1108,7 @@
<dl class="py class">
<dt class="sig sig-object py" id="kazu.modelling.distillation.models.TaskSpecificDistillation">
<em class="property"><span class="pre">class</span><span class="w"> </span></em><span class="sig-prename descclassname"><span class="pre">kazu.modelling.distillation.models.</span></span><span class="sig-name descname"><span class="pre">TaskSpecificDistillation</span></span><a class="reference external" href="https://github.com/AZ-AI/kazu/blob/main/kazu/modelling/distillation/models.py#L143-L244"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#kazu.modelling.distillation.models.TaskSpecificDistillation" title="Permalink to this definition">#</a></dt>
-<dd><p>Bases: <a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.core.LightningModule.html#pytorch_lightning.core.LightningModule" title="(in PyTorch Lightning v1.8.3.post1)"><code class="xref py py-class docutils literal notranslate"><span class="pre">LightningModule</span></code></a></p>
+<dd><p>Bases: <a class="reference external" href="https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.core.LightningModule.html#pytorch_lightning.core.LightningModule" title="(in PyTorch Lightning v1.8.4)"><code class="xref py py-class docutils literal notranslate"><span class="pre">LightningModule</span></code></a></p>
<dl class="py method">
<dt class="sig sig-object py" id="kazu.modelling.distillation.models.TaskSpecificDistillation.__init__">
<span class="sig-name descname"><span class="pre">__init__</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">temperature</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">warmup_steps</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">learning_rate</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">weight_decay</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">batch_size</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">accumulate_grad_batches</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">max_epochs</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">schedule</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em><span class="sig-paren">)</span><a class="reference external" href="https://github.com/AZ-AI/kazu/blob/main/kazu/modelling/distillation/models.py#L144-L189"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#kazu.modelling.distillation.models.TaskSpecificDistillation.__init__" title="Permalink to this definition">#</a></dt>
