
Commit
Merge branch 'master' into feature/newFilter
# Conflicts:
#	neo/core/container.py
#	neo/test/coretest/test_container.py
Moritz-Alexander-Kern committed Jul 21, 2023
2 parents b3800ae + 6ce00dc commit 17666ac
Showing 207 changed files with 28,100 additions and 2,263 deletions.
28 changes: 28 additions & 0 deletions .github/ISSUE_TEMPLATE/bug_report.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,28 @@
---
name: Bug report
about: Create a report to help us fix problems
title: ''
labels: bug
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behaviour, preferably with a simple code example and the full traceback. If the error happens in the middle of some complex code, please try to find a simpler, minimal example that demonstrates the error.

If the error occurs when reading a file that you can't share publicly, please let us know, and we'll get in touch to discuss sharing it privately.

**Expected behaviour**
If the bug is incorrect behaviour, rather than an unexpected Exception, please give a clear and concise description of what you expected to happen.

**Environment:**
- OS: [e.g. macOS, Linux, Windows]
- Python version
- Neo version
- NumPy version

**Additional context**
Add any other context about the problem here.
17 changes: 17 additions & 0 deletions .github/ISSUE_TEMPLATE/confusing-documentation.md
@@ -0,0 +1,17 @@
---
name: Confusing documentation
about: Let us know if the documentation is confusing or incorrect
title: ''
labels: Documentation
assignees: ''

---

**Which page is the problem on?**
Give the URL of the documentation page where the problem is, and either copy-paste the confusing text (for a short section) or give the first and last few words (for a long section).

**What is the problem?**
Is the documentation (a) confusing or (b) incorrect? In what way?

**Suggestions for fixing the problem**
If the documentation is confusing, can you suggest an improvement? If the documentation is incorrect, what should it say instead?
20 changes: 20 additions & 0 deletions .github/ISSUE_TEMPLATE/feature_request.md
@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea to improve Neo
title: ''
labels: enhancement
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.
109 changes: 109 additions & 0 deletions .github/workflows/caches_cron_job.yml
@@ -0,0 +1,109 @@
name: Create caches for ephy_testing_data and conda env

on:
workflow_dispatch: # Workflow can be triggered manually via the GH Actions web interface
push: # When something is pushed to master, this checks whether caches need to be re-created
branches:
- master
schedule:
- cron: "0 12 * * *" # Daily at noon UTC

jobs:

create-conda-env-cache-if-missing:
name: Caching conda env
runs-on: "ubuntu-latest"
strategy:
fail-fast: true
defaults:
# by default run in bash mode (required for conda usage)
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v3

- name: Get current year-month
id: date
run: |
echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT
- name: Get current dependencies hash
id: dependencies
run: |
echo "hash=${{hashFiles('**/pyproject.toml', '**/environment_testing.yml')}}" >> $GITHUB_OUTPUT
- uses: actions/cache@v3
# the cache for python package is reset:
# * every month
# * when package dependencies change
id: cache-conda-env
with:
path: /usr/share/miniconda/envs/neo-test-env
key: ${{ runner.os }}-conda-env-${{ steps.dependencies.outputs.hash }}-${{ steps.date.outputs.date }}

- name: Cache found?
run: echo "Cache-hit == ${{steps.cache-conda-env.outputs.cache-hit == 'true'}}"

# activate environment if not restored from cache
- uses: conda-incubator/setup-miniconda@v2.2.0
if: steps.cache-conda-env.outputs.cache-hit != 'true'
with:
activate-environment: neo-test-env
environment-file: environment_testing.yml
python-version: 3.9

- name: Create the conda environment to be cached
if: steps.cache-conda-env.outputs.cache-hit != 'true'
# create conda env, configure git and install pip, neo and test dependencies from master
# for PRs that change dependencies, this environment will be updated in the test workflow
run: |
git config --global user.email "neo_ci@fake_mail.com"
git config --global user.name "neo CI"
python -m pip install -U pip # Official recommended way
pip install --upgrade -e .[test]
create-data-cache-if-missing:
name: Caching data env
runs-on: "ubuntu-latest"
steps:

- name: Get current hash (SHA) of the ephy_testing_data repo
id: ephy_testing_data
run: |
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/ephy_testing_data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT
- uses: actions/cache@v3
# Loading cache of ephys_testing_dataset
id: cache-datasets
with:
path: ~/ephy_testing_data
key: ${{ runner.os }}-datasets-${{ steps.ephy_testing_data.outputs.dataset_hash }}

- name: Cache found?
run: echo "Cache-hit == ${{steps.cache-datasets.outputs.cache-hit == 'true'}}"

- name: Installing datalad and git-annex
if: steps.cache-datasets.outputs.cache-hit != 'true'
run: |
git config --global user.email "neo_ci@fake_mail.com"
git config --global user.name "neo CI"
python -m pip install -U pip # Official recommended way
pip install datalad-installer
datalad-installer --sudo ok git-annex --method datalad/packages
pip install datalad
git config --global filter.annex.process "git-annex filter-process" # recommended for efficiency
- name: Download dataset
if: steps.cache-datasets.outputs.cache-hit != 'true'
# Download repository and also fetch data
run: |
cd ~
datalad install --recursive --get-data https://gin.g-node.org/NeuralEnsemble/ephy_testing_data
- name: Show size of the cache to assert data is downloaded
run: |
cd ~
pwd
du -hs ephy_testing_data
cd ephy_testing_data
pwd
30 changes: 21 additions & 9 deletions .github/workflows/core-test.yml
@@ -4,12 +4,17 @@ on:
pull_request:
branches: [master]
types: [synchronize, opened, reopened, ready_for_review]
paths:
- 'neo/core/**'
- 'pyproject.toml'

# run checks on any change of master, including merge of PRs
push:
branches: [master]


concurrency: # Cancel previous workflows on the same pull request
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true

jobs:
multi-os-python-numpy:
@@ -19,21 +24,28 @@ jobs:
fail-fast: true
matrix:
os: ["ubuntu-latest", "windows-latest"]
# "macos-latest",
python-version: ['3.7', '3.8', '3.9']
numpy-version: ['1.16.6', '1.17.5', '1.18.5', '1.19.5', '1.20.3', '1.21.5', '1.22.3']
# "macos-latest",
python-version: ['3.8', '3.9', '3.10', '3.11']
numpy-version: ['1.19.5', '1.20.3', '1.21.6', '1.22.4', '1.23.5', '1.24.1']
exclude:
- python-version: '3.7'
numpy-version: '1.22.3'

- python-version: '3.10'
numpy-version: '1.19.5'
- python-version: '3.10'
numpy-version: '1.20.3'
- python-version: '3.11'
numpy-version: '1.19.5'
- python-version: '3.11'
numpy-version: '1.20.3'
- python-version: '3.11'
numpy-version: '1.21.6'
steps:
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}

- name: Checkout repository
uses: actions/checkout@v2
uses: actions/checkout@v3

- name: Install numpy ${{ matrix.numpy-version }}
run: |
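The `exclude` entries in the matrix above prune Python/NumPy combinations that are not supported together (older NumPy releases predate the newer Python versions). A small sketch, assuming the version lists shown in the diff, of how the effective matrix is computed per OS:

```python
from itertools import product

python_versions = ["3.8", "3.9", "3.10", "3.11"]
numpy_versions = ["1.19.5", "1.20.3", "1.21.6", "1.22.4", "1.23.5", "1.24.1"]
# combinations dropped by the workflow's `exclude` list
exclude = {
    ("3.10", "1.19.5"), ("3.10", "1.20.3"),
    ("3.11", "1.19.5"), ("3.11", "1.20.3"), ("3.11", "1.21.6"),
}

# full cross-product minus the excluded pairs
combos = [(py, np) for py, np in product(python_versions, numpy_versions)
          if (py, np) not in exclude]
```

With two OSes in the matrix, each of these Python/NumPy pairs runs twice, once per OS.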
30 changes: 30 additions & 0 deletions .github/workflows/ebrains.yml
@@ -0,0 +1,30 @@
name: Mirror to EBRAINS

# Configure the events that are going to trigger the automated update of the mirror
on:
push:
branches: [ master ]

# Configure what will be updated
jobs:
# set the job name
to_ebrains:
runs-on: ubuntu-latest
steps:
# this task will push the master branch of the source_repo (github) to the
# destination_repo (ebrains gitlab)
- name: syncmaster
uses: wei/git-sync@v3
with:
source_repo: https://github.com/NeuralEnsemble/python-neo
source_branch: "master"
destination_repo: "https://ghpusher:${{ secrets.EBRAINS_GITLAB_ACCESS_TOKEN }}@gitlab.ebrains.eu/NeuralEnsemble/neo.git"
destination_branch: "main"
# this task will push all tags from the source_repo to the destination_repo
- name: synctags
uses: wei/git-sync@v3
with:
source_repo: https://github.com/NeuralEnsemble/python-neo
source_branch: "refs/tags/*"
destination_repo: "https://ghpusher:${{ secrets.EBRAINS_GITLAB_ACCESS_TOKEN }}@gitlab.ebrains.eu/NeuralEnsemble/neo.git"
destination_branch: "refs/tags/*"
78 changes: 52 additions & 26 deletions .github/workflows/io-test.yml
@@ -1,80 +1,106 @@
name: NeoIoTest

on:
pull_request:
branches: [master]
types: [synchronize, opened, reopened, ready_for_review]

# run checks on any change of master, including merge of PRs
push:
branches: [master]
workflow_call:
inputs:
os:
required: true
type: string

concurrency: # Cancel previous workflows on the same pull request
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true

jobs:
build-and-test:
name: Test on (${{ matrix.os }})
runs-on: ${{ matrix.os }}
name: Test on (${{ inputs.os }})
runs-on: ${{ inputs.os }}
strategy:
fail-fast: true
matrix:
# "macos-latest", "windows-latest"
os: ["ubuntu-latest", ]
python-version: ['3.8', ]
python-version: ['3.9', ]
defaults:
# by default run in bash mode (required for conda usage)
run:
shell: bash -l {0}
steps:

- name: Checkout repository
uses: actions/checkout@v2
uses: actions/checkout@v3

- name: Get current year-month
id: date
run: echo "::set-output name=date::$(date +'%Y-%m')"
run: echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT

- name: Get ephy_testing_data current head hash
# the key depend on the last commit repo https://gin.g-node.org/NeuralEnsemble/ephy_testing_data.git
id: vars
id: ephy_testing_data
run: |
echo "::set-output name=HASH_EPHY_DATASET::$(git ls-remote https://gin.g-node.org/NeuralEnsemble/ephy_testing_data.git HEAD | cut -f1)"
echo "dataset_hash=$(git ls-remote https://gin.g-node.org/NeuralEnsemble/ephy_testing_data.git HEAD | cut -f1)" >> $GITHUB_OUTPUT
- uses: actions/cache@v3
- uses: actions/cache/restore@v3
# Loading cache of ephys_testing_dataset
id: cache-datasets
with:
path: ~/ephy_testing_data
key: ${{ runner.os }}-datasets-${{ steps.vars.outputs.HASH_EPHY_DATASET }}
key: ${{ runner.os }}-datasets-${{ steps.ephy_testing_data.outputs.dataset_hash }}
restore-keys: ${{ runner.os }}-datasets-

- uses: conda-incubator/setup-miniconda@v2
- uses: conda-incubator/setup-miniconda@v2.2.0
with:
activate-environment: neo-test-env
python-version: ${{ matrix.python-version }}
clean-patched-environment-file: false

- uses: actions/cache@v3
- name: Get current dependencies hash
id: dependencies
run: |
echo "hash=${{hashFiles('**/pyproject.toml', '**/environment_testing.yml')}}" >> $GITHUB_OUTPUT
- uses: actions/cache/restore@v3
# the cache for python package is reset:
# * every month
# * when requirements/requirements_testing change
# * when package dependencies change
id: cache-conda-env
with:
path: /usr/share/miniconda/envs/neo-test-env
key: ${{ runner.os }}-conda-env-${{ hashFiles('**/requirements.txt') }}-${{ hashFiles('**/requirements_testing.txt') }}-${{ hashFiles('**/environment_testing.txt') }}-${{ steps.date.outputs.date }}
key: ${{ runner.os }}-conda-env-${{ steps.dependencies.outputs.hash }}-${{ steps.date.outputs.date }}
# restore-keys match any key that starts with the restore-key
restore-keys: |
${{ runner.os }}-conda-env-${{ steps.dependencies.outputs.hash }}-
${{ runner.os }}-conda-env-
- name: Install testing dependencies
# testing environment is only installed if no cache was found
# testing environment is only created from yml if no cache was found
# restore-key hits should result in `cache-hit` == 'false'
if: steps.cache-conda-env.outputs.cache-hit != 'true'
run: |
conda env update neo-test-env --file environment_testing.yml
conda env update --name neo-test-env --file environment_testing.yml --prune
- name: Configure git
run: |
git config --global user.email "neo_ci@fake_mail.com"
git config --global user.name "neo CI"
- name: Install neo
- name: Install neo including dependencies
# installation with dependencies is only required if no cache was found
# restore-key hits should result in `cache-hit` == 'false'
if: steps.cache-conda-env.outputs.cache-hit != 'true'
run: |
pip install --upgrade -e .
pip install .[test]
- name: Install neo without dependencies
# only installing neo version to test as dependencies should be in cached conda env already
if: steps.cache-conda-env.outputs.cache-hit == 'true'
run: |
pip install --no-dependencies -e .
- name: Install wine
run: |
sudo rm -f /etc/apt/sources.list.d/microsoft-prod.list
sudo dpkg --add-architecture i386
sudo apt-get update -qq
sudo apt-get install -yqq --allow-downgrades libc6:i386 libgcc-s1:i386 libstdc++6:i386 wine
- name: Test with pytest
run: |
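The `restore-keys` added to the io-test workflow above let a run fall back to the most specific existing cache whose key starts with a given prefix; `cache-hit` is only `'true'` on an exact key match, which is why the install steps treat prefix hits as a miss and reinstall on top of the restored environment. A simplified Python sketch of these semantics (the real `actions/cache` prefers the most recently created match; this sketch just takes the first):

```python
def resolve_cache(exact_key, restore_keys, available):
    """Return (restored_key, exact_hit): exact match first,
    then each restore-key prefix in order, else (None, False)."""
    if exact_key in available:
        return exact_key, True
    for prefix in restore_keys:
        matches = [k for k in available if k.startswith(prefix)]
        if matches:
            # simplification: actions/cache picks the newest match
            return matches[0], False
    return None, False
```

Under this scheme a dependency change still restores last month's environment via the `${{ runner.os }}-conda-env-` prefix, and the workflow then updates it from `environment_testing.yml` rather than building from scratch.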
