Machine-translated version of the BFCL-v2 benchmark.
Developer:
- Dan Saattrup Smart (dan.smart@alexandra.dk)
- Run `make install`, which sets up a virtual environment and all Python dependencies therein.
- Run `source .venv/bin/activate` to activate the virtual environment.
To install new PyPI packages, run `uv add <package-name>`. To remove them again, run `uv remove <package-name>`. To show all installed packages, run `uv pip list`.

The project includes the following convenience commands:
- `make install`: Install the project and its dependencies in a virtual environment.
- `make install-pre-commit`: Install pre-commit hooks for linting, formatting and type checking.
- `make check`: Lint and format the code using `ruff`, and type check using `pyrefly`.
- `make test`: Run tests using `pytest` and update the coverage badge in the readme.
- `make docker`: Build a Docker image and run the Docker container.
- `make docs`: View documentation locally in a browser.
- `make publish-docs`: Publish documentation to GitHub Pages.
- `make tree`: Show the project structure as a tree.
In the `src` directory there are two subdirectories, `multi_bfcl` and `scripts`. This is a brief explanation of the differences between the two.

All Python files in the `multi_bfcl` directory are modules internal to the project package. Examples here could be a general data loading script, a definition of a model, or a training function. Think of modules as all the building blocks of a project.
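As a sketch, a module in the package might look like the following (the file name and function are hypothetical, purely for illustration):

```python
# src/multi_bfcl/data.py (hypothetical module, for illustration)
"""Data loading utilities for the project."""


def load_samples(path: str) -> list[str]:
    """Load benchmark samples from a text file, one sample per line.

    Blank lines are skipped and surrounding whitespace is stripped.
    """
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]
```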
When a module imports functions/classes from other modules, we use the relative import notation - here's an example:

`from .other_module import some_function`

Python files in the `scripts` folder are scripts: short code snippets that are external to the project package, and which are meant to actually run the code. As such, only scripts will be called from the terminal. An analogy here is that the internal numpy code are all modules, but the Python code you write where you import some numpy functions and actually run them is a script.
When importing module functions/classes from a script, you do it like you would normally import from any other package:

`from multi_bfcl import some_function`

Note that this is also how we import functions/classes in tests, since each test Python file is also a Python script, rather than a module.
A Dockerfile is included in the repository, which by default runs `src/scripts/main.py`. You can build the Docker image and run the Docker container by running `make docker`.
Run `make docs` to create the documentation in the `docs` folder, which is based on the docstrings in your code. You can publish this documentation to GitHub Pages by running `make publish-docs`. To add more manual documentation pages, simply add more Markdown files to the `docs` directory; these will automatically be included in the documentation.
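Since the documentation is generated from docstrings, it pays to document functions thoroughly. A hypothetical example (the function and Google-style docstring layout are assumptions; your docs tooling may expect a different style):

```python
def translate_sample(text: str, target_language: str = "da") -> str:
    """Translate a single benchmark sample.

    Args:
        text:
            The text to translate.
        target_language:
            The ISO 639-1 code of the target language. Defaults to "da".

    Returns:
        The translated text.
    """
    # Placeholder implementation, for illustration only.
    return f"[{target_language}] {text}"
```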
Run `make test` to test your code, which also updates the coverage badge in the README, showing you how much of your code base is currently covered by tests.
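A test file is just a script-style Python file in the `tests` directory whose functions start with `test_`, so that `pytest` can discover them. A minimal sketch (the file name and the function under test are hypothetical; normally you would import the function from `multi_bfcl` instead of defining it inline):

```python
# tests/test_example.py (hypothetical file name, for illustration)


def add(a: int, b: int) -> int:
    """A stand-in for a function you would normally import from the package."""
    return a + b


def test_add() -> None:
    assert add(2, 3) == 5
```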
GitHub CI pipelines are included in the repo, running all the tests in the `tests` directory, as well as building the online documentation, if GitHub Pages has been enabled for the repository (this can be enabled in the repository settings on GitHub).
GitHub Codespaces allows you to develop on a project completely in the cloud, without having to do any local setup at all. This repo includes a configuration file for running codespaces on GitHub. When the repo is hosted on alexandrainst/multi_bfcl, simply press the `<> Code` button and add a codespace to get started; this will open a VSCode window directly in your browser.