Merge branch 'master' into fix-channel-topic-error-message
pditommaso committed Jul 30, 2024
2 parents b8f4910 + 96ec4de commit 71c4184
Showing 21 changed files with 174 additions and 121 deletions.
36 changes: 18 additions & 18 deletions .github/workflows/docs.yml
Original file line number Diff line number Diff line change
@@ -1,22 +1,22 @@
 name: Docs CI
 on:
-  pull_request:
-    types: [opened, reopened, synchronize]
-    paths:
-      - "docs/**"
-  workflow_dispatch:
+  pull_request:
+    types: [opened, reopened, synchronize]
+    paths:
+      - 'docs/**'
+  workflow_dispatch:
 jobs:
-  docs-build:
-    name: Build
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v4
-      - uses: actions/setup-python@v4
-        with:
-          python-version: "3.8"
+  docs-build:
+    name: Build
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v4
+        with:
+          python-version: '3.9'
 
-      - name: Test docs build
-        run: |
-          cd docs/
-          pip install -r requirements.txt
-          make clean html
+      - name: Test docs build
+        run: |
+          cd docs/
+          pip install -r requirements.txt
+          make clean html
19 changes: 7 additions & 12 deletions docs/Dockerfile
@@ -1,21 +1,16 @@
-FROM mambaorg/micromamba:1.5.8
+FROM mambaorg/micromamba:1.3.1
 
-LABEL maintainer="Ben Sherman <bentshermann@gmail.com>"
-
-ARG MAKE_VERSION=4.3
-# Note netlify latest currently supported python version is 3.8
-ARG PYTHON_VERSION=3.8
-ARG GIT_VERSION=2.45.2
+MAINTAINER Ben Sherman <bentshermann@gmail.com>
 
 RUN micromamba install --yes --name base --channel conda-forge \
-    make=${MAKE_VERSION} \
-    python=${PYTHON_VERSION} \
-    conda-forge:git=${GIT_VERSION} && \
+    make=4.3 \
+    python=3.7 \
+    conda-forge:git=2.45.0 \
+    && \
     micromamba clean --all --yes
 
 COPY requirements.txt .
 
 RUN eval "$(micromamba shell hook --shell=bash)" && \
     micromamba activate && \
-    pip install -r requirements.txt && \
-    micromamba clean --all --yes
+    pip install -r requirements.txt
9 changes: 6 additions & 3 deletions docs/README.md
@@ -2,9 +2,10 @@
 
 Nextflow documentation is written using [Sphinx](http://www.sphinx-doc.org/), [MyST](https://myst-parser.readthedocs.io/en/latest/) which is an extended version of Markdown for Sphinx, and the [Read The Docs theme for Sphinx](https://github.com/readthedocs/sphinx_rtd_theme).
 
+
 ## Dependencies
 
-The most convenient approach is to create a Conda environment with Python 3.8 (other versions may work but haven't been tested).
+The most convenient approach is to create a Conda environment with Python 3.7 (other versions may work but haven't been tested).
 
 The build dependencies can be installed with `pip`:
 
@@ -15,6 +16,7 @@ pip install -r requirements.txt
 
 Alternatively, you can use the Dockerfile to build the docs in a container (see below).
 
+
 ## Contributing
 
 To edit and contribute to the documentation, you only need a text editor to change the appropriate `.md` files in this directory.
@@ -28,8 +30,8 @@ make clean html
 Alternatively, you can use the Dockerfile to build the docs in a container:
 
 ```bash
-docker build -t nextflow/sphinx:7.3.7 .
-docker run -v $(pwd):/tmp nextflow/sphinx:7.3.7 -- make html
+docker build -t nextflow/sphinx:5.3.0 .
+docker run -v $(pwd):/tmp nextflow/sphinx:5.3.0 -- make html
 ```
 
 Then start up a local http server and open `localhost:8080` in your browser to verify the changes:
@@ -38,6 +40,7 @@ Then start up a local http server and open `localhost:8080` in your browser to verify the changes:
 python -m http.server 8080 --directory _build/html/
 ```
 
+
 ## License
 
 Nextflow documentation is distributed under
6 changes: 4 additions & 2 deletions docs/aws.md
@@ -190,7 +190,7 @@ The `aws` command can be made available by either (1) installing it in the conta
 To configure your pipeline for AWS Batch:
 
 1. Specify the AWS Batch {ref}`executor <awsbatch-executor>`
-2. Specify one or more AWS Batch queues with the {ref}`process-queue` directive
+2. Specify the AWS Batch queue with the {ref}`process-queue` directive
 3. Specify any Batch job container options with the {ref}`process-containerOptions` directive.
 
 An example `nextflow.config` file is shown below:
@@ -212,7 +212,9 @@ aws {
 }
 ```
 
-Different queues bound to the same or different Compute Environments can be configured according to each process' requirements.
+:::{tip}
+Each process can be configured with its own queue by using the {ref}`process-queue` directive in the process definition or via {ref}`config-process-selectors` in your Nextflow configuration.
+:::
 
 ## Container Options
 
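The per-process queue configuration described in the tip can be sketched as a Nextflow configuration fragment. This is only a sketch of the pattern, not part of the commit: the process names `ALIGN` and `QC` and the queue names are hypothetical.

```groovy
// Sketch (hypothetical names): route individual processes to dedicated
// AWS Batch queues via process selectors in nextflow.config.
process {
    executor = 'awsbatch'
    queue = 'default-queue'          // fallback queue for all processes

    withName: 'ALIGN' {
        queue = 'highmem-queue'      // hypothetical resource-heavy process
    }
    withName: 'QC' {
        queue = 'spot-queue'         // hypothetical low-priority process
    }
}
```

The same `queue` directive can instead be set directly inside each process definition; the config-selector form keeps cluster-specific settings out of the pipeline script.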
37 changes: 17 additions & 20 deletions docs/azure.md
@@ -36,7 +36,7 @@ Once the Blob Storage credentials are set, you can access the files in the blob

## Azure File Shares

-_New in `nf-azure` version `0.11.0`_
+*New in `nf-azure` version `0.11.0`*

Nextflow has built-in support also for [Azure Files](https://azure.microsoft.com/en-us/services/storage/files/). Files available in the serverless Azure File shares can be mounted concurrently on the nodes of a pool executing the pipeline. These files become immediately available in the file system and can be referred as local files within the processes. This is especially useful when a task needs to access large amounts of data (such as genome indexes) during its execution. An arbitrary number of File shares can be mounted on each pool node.

@@ -143,7 +143,6 @@ The list of Azure regions can be found by executing the following Azure CLI comm
```bash
az account list-locations -o table
```

:::

Finally, launch your pipeline with the above configuration:
@@ -164,13 +163,13 @@

To specify multiple Azure machine families, use a comma separated list with glob (`*`) values in the `machineType` directive. For example, the following will select any machine size from D or E v5 machines, with additional data disk, denoted by the `d` suffix:

-```groovy
+```config
 process.machineType = "Standard_D*d_v5,Standard_E*d_v5"
 ```

For example, the following process will create a pool of `Standard_E4d_v5` machines based when using `autoPoolMode`:

-```groovy
+```nextflow
process EXAMPLE_PROCESS {
machineType "Standard_E*d_v5"
cpus 16
@@ -217,7 +216,6 @@ Error executing process > '<process name> (1)'
Caused by:
Azure Batch pool '<pool name>' not in active state
```

:::

### Named pools
@@ -266,7 +264,6 @@ azure {
}
}
```

:::

### Requirements on pre-existing named pools
@@ -325,28 +322,28 @@ If you need a different strategy, you can provide your own formula using the `sc

When Nextflow creates a pool of compute nodes, it selects:

-- the virtual machine image reference to be installed on the node
-- the Batch node agent SKU, a program that runs on each node and provides an interface between the node and the Batch service
+- the virtual machine image reference to be installed on the node
+- the Batch node agent SKU, a program that runs on each node and provides an interface between the node and the Batch service

Together, these settings determine the Operating System and version installed on each node.

By default, Nextflow creates pool nodes based on CentOS 8, but this behavior can be customised in the pool configuration. Below are configurations for image reference/SKU combinations to select two popular systems.

-- Ubuntu 20.04 (default):
+- Ubuntu 20.04 (default):
 
-  ```groovy
-  azure.batch.pools.<name>.sku = "batch.node.ubuntu 20.04"
-  azure.batch.pools.<name>.offer = "ubuntu-server-container"
-  azure.batch.pools.<name>.publisher = "microsoft-azure-batch"
-  ```
+  ```groovy
+  azure.batch.pools.<name>.sku = "batch.node.ubuntu 20.04"
+  azure.batch.pools.<name>.offer = "ubuntu-server-container"
+  azure.batch.pools.<name>.publisher = "microsoft-azure-batch"
+  ```

-- CentOS 8:
+- CentOS 8:
 
-  ```groovy
-  azure.batch.pools.<name>.sku = "batch.node.centos 8"
-  azure.batch.pools.<name>.offer = "centos-container"
-  azure.batch.pools.<name>.publisher = "microsoft-azure-batch"
-  ```
+  ```groovy
+  azure.batch.pools.<name>.sku = "batch.node.centos 8"
+  azure.batch.pools.<name>.offer = "centos-container"
+  azure.batch.pools.<name>.publisher = "microsoft-azure-batch"
+  ```

In the above snippet, replace `<name>` with the name of your Azure node pool.

2 changes: 1 addition & 1 deletion docs/google.md
@@ -250,7 +250,7 @@ The `disk` directive can be used to override the disk requested by Fusion. See t
 
 ### Supported directives
 
-The integration with Google Batch is a developer preview feature. Currently, the following Nextflow directives are supported:
+Currently, the following Nextflow directives are supported by the Google Batch executor:

- {ref}`process-accelerator`
- {ref}`process-container`
3 changes: 0 additions & 3 deletions docs/netlify.toml
@@ -6,8 +6,5 @@
[[headers]]
for = "/*"

-[build.environment]
-PYTHON_VERSION = "3.8"
-
[headers.values]
X-Robots-Tag = "noindex"
16 changes: 4 additions & 12 deletions docs/process.md
@@ -2370,18 +2370,10 @@ process grid_job {
}
```

-Multiple queues can be specified by separating their names with a comma for example:
-
-```groovy
-process grid_job {
-    queue 'short,long,cn-el6'
-    executor 'sge'
-
-    """
-    your task script here
-    """
-}
-```
+:::{tip}
+Grid executors allow specifying multiple queue names separating them with a comma e.g. `queue 'short,long,cn-el6'`.
+However, this does not generally apply to other executors such as AWS Batch, Azure Batch, Google Batch.
+:::

:::{note}
This directive is only used by certain executors. Refer to the {ref}`executor-page` page to see which executors support this directive.
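The comma-separated queue syntax that the tip retains can be sketched as a process definition. The queue names follow the example removed in this commit; the script body is a placeholder.

```groovy
// Sketch: a grid (SGE) process that may be dispatched to any of several
// queues. Comma-separated queue lists apply to grid executors only,
// not to cloud batch executors such as AWS Batch or Azure Batch.
process grid_job {
    executor 'sge'
    queue 'short,long,cn-el6'

    """
    your task script here
    """
}
```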
5 changes: 0 additions & 5 deletions docs/requirements-top-level.txt

This file was deleted.

36 changes: 4 additions & 32 deletions docs/requirements.txt
@@ -1,33 +1,5 @@
-alabaster==0.7.13
-Babel==2.15.0
-certifi==2024.7.4
-charset-normalizer==3.3.2
-docutils==0.20.1
-idna==3.7
-imagesize==1.4.1
-importlib-metadata==8.0.0
-jinja2==3.1.4
-markdown-it-py==3.0.0
-MarkupSafe==2.1.5
-mdit-py-plugins==0.4.1
-mdurl==0.1.2
-myst-parser==3.0.1
-packaging==24.1
-pygments==2.18.0
-pytz==2024.1
-PyYAML==6.0.1
-requests==2.32.3
-snowballstemmer==2.2.0
-sphinx==7.1.2
-sphinx-rtd-theme==2.0.0
-sphinxcontrib-applehelp==1.0.4
-sphinxcontrib-devhelp==1.0.2
-sphinxcontrib-htmlhelp==2.0.1
-sphinxcontrib-jquery==4.1
-sphinxcontrib-jsmath==1.0.1
+myst-parser==0.18.1
+sphinx==5.3.0
+sphinx-rtd-theme==1.1.1
 sphinxcontrib-mermaid==0.9.2
-sphinxcontrib-qthelp==1.0.3
-sphinxcontrib-serializinghtml==1.1.5
-sphinxext-rediraffe==0.2.7
-urllib3==2.2.2
-zipp==3.19.2
+sphinxext-rediraffe==0.2.7
2 changes: 1 addition & 1 deletion gradle/wrapper/gradle-wrapper.properties
@@ -1,6 +1,6 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-8.8-bin.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-8.9-bin.zip
networkTimeout=10000
validateDistributionUrl=true
zipStoreBase=GRADLE_USER_HOME
@@ -546,6 +546,7 @@ class K8sDriverLauncher {
.withEnv( PodEnv.value('NXF_ANSI_LOG', 'false'))
.withMemory(headMemory?:"")
.withCpus(headCpus)
+.withCpuLimits(k8sConfig.cpuLimitsEnabled())

if ( k8sConfig.useJobResource()) {
this.resourceType = ResourceType.Job
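The added `.withCpuLimits(k8sConfig.cpuLimitsEnabled())` call is driven by a Kubernetes scope setting. Assuming that setting is exposed as `k8s.cpuLimits` — an inference from the `cpuLimitsEnabled()` accessor, not something this diff shows — enabling it might look like:

```groovy
// Sketch (assumption): nextflow.config fragment enabling CPU limits on the
// Kubernetes head pod; the flag name is inferred from cpuLimitsEnabled()
// and is not confirmed by this diff.
k8s {
    cpuLimits = true
}
```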
@@ -286,6 +286,9 @@ class PluginExtensionProvider implements ExtensionProvider {
def factory = (ChannelFactoryInstance)reference.target
return factory.invokeExtensionMethod(reference.method, args)
}
+else {
+    throw new MissingMethodException("Channel.${name}", Object.class, args)
+}
}

}
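With the added `else` branch, a call to a channel factory method that no plugin provides now throws a `MissingMethodException` immediately, which the branch name (`fix-channel-topic-error-message`) suggests is meant to surface a clearer error. A hypothetical pipeline-level illustration:

```groovy
// Hypothetical illustration: a misspelled channel factory name now raises
// MissingMethodException at the call site instead of failing obscurely later.
ch = Channel.fromQuerry('select * from x')   // typo for a factory method name
```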
@@ -25,6 +25,7 @@ import nextflow.Const
import nextflow.ast.NextflowDSLImpl
import nextflow.exception.AbortOperationException
import nextflow.exception.FailedGuardException
+import nextflow.exception.ProcessUnrecoverableException
import nextflow.executor.BashWrapperBuilder
import nextflow.executor.res.AcceleratorResource
import nextflow.executor.res.DiskResource
@@ -256,7 +257,7 @@ class TaskConfig extends LazyMap implements Cloneable {
new MemoryUnit(value.toString().trim())
}
catch( Exception e ) {
-            throw new AbortOperationException("Not a valid 'memory' value in process definition: $value")
+            throw new ProcessUnrecoverableException("Not a valid 'memory' value in process definition: $value")
}
}

@@ -334,7 +335,11 @@

int getCpus() {
final val = getCpus0()
+        if( val<0 )
+            throw new ProcessUnrecoverableException("Directive 'cpus' cannot be a negative value - offending value: $val")
+        final lim = getResourceLimit('cpus') as Integer
+        if( lim!=null && lim<1 )
+            throw new ProcessUnrecoverableException("Directive 'resourceLimits.cpus' cannot be a negative value - offending value: $lim")
+        return val && lim && val > lim ? lim : val
}

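The new `getCpus()` logic validates the `cpus` request and clamps it to the `resourceLimits` value when one is set. A configuration sketch of how that limit is expressed (the limit values here are illustrative only):

```groovy
// Sketch: with resourceLimits set, a process requesting more CPUs than the
// limit is clamped down to it rather than failing, per the new getCpus() logic.
process {
    resourceLimits = [ cpus: 16, memory: '64.GB' ]
}
```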
