

NLP Sandbox Date Annotator Example


Introduction

NLPSandbox.io is an open platform for benchmarking modular natural language processing (NLP) tools on both public and private datasets. Academics, students, and industry professionals are invited to browse the available tasks and participate by developing and submitting an NLP Sandbox tool.

This repository provides an example implementation of the NLP Sandbox Date Annotator API written in Python-Flask. An NLP Sandbox date annotator takes as input a clinical note (text) and outputs a list of predicted date annotations found in the clinical note. Here dates are identified using regular expressions.
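As a rough illustration of the regex-based approach described above, the sketch below extracts dates with a deliberately simple pattern. The pattern and the annotation field names (start, length, text) are assumptions for illustration; the example tool's actual patterns and schema fields live in this repository and the NLP Sandbox schemas.

```python
import re

# A deliberately simple pattern covering formats such as
# "10/26/2002" and "2002-10-26"; the example tool's real patterns are richer.
DATE_PATTERN = re.compile(r"\b(\d{1,2}/\d{1,2}/\d{2,4}|\d{4}-\d{2}-\d{2})\b")

def annotate_dates(text):
    """Return a list of date annotations (character offset, length, text)."""
    annotations = []
    for match in DATE_PATTERN.finditer(text):
        annotations.append({
            "start": match.start(),
            "length": match.end() - match.start(),
            "text": match.group(0),
        })
    return annotations

note = "Patient admitted on 10/26/2002 and discharged 2002-11-02."
print(annotate_dates(note))
```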

This tool is provided to NLP developers who develop in Python as a starting point for packaging their own date annotator as an NLP Sandbox tool (see the section Development). That section also describes how to generate a tool "stub" using openapi-generator for 50+ programming languages and frameworks. This repository includes a GitHub CI/CD workflow that lints, tests, builds and pushes a Docker image of this tool to the Synapse Docker Registry. The image of this example tool can be submitted as-is to NLPSandbox.io to benchmark its performance -- just don't expect high performance!

Contents

Specification

Requirements

Usage

Running with Docker

The command below starts this NLP Sandbox date annotator locally.

docker compose up --build

You can stop the container run with Ctrl+C, followed by docker compose down.

Running with Python

Create a Conda environment.

conda create --name date-annotator python=3.9
conda activate date-annotator

Install and start this NLP Sandbox date annotator.

cd server && pip install -r requirements.txt
python -m openapi_server

Accessing this NLP Sandbox tool User Interface

This NLP Sandbox tool provides a web interface that you can use to annotate clinical notes. This web client has been automatically generated by openapi-generator. To access the UI, open a new tab in your browser and navigate to one of the following addresses, depending on whether you are running the tool with Docker (production) or Python (development).

Development

This section describes how to develop your own NLP Sandbox date annotator in Python-Flask and other programming languages-frameworks. This example tool is also available in Java in the GitHub repository nlpsandbox/date-annotator-example-java.

Development requirements

Creating a GitHub repository

Depending on the language-frameworks you want to develop with:

You can also use a different code repository hosting service, such as GitLab or Bitbucket.

Configuring the CI/CD workflow

This repository includes a GitHub CI/CD workflow that lints, tests, builds and pushes a Docker image of this tool to the Synapse Docker Registry. Only the images that have been pushed to the Synapse Docker Registry can be submitted to NLPSandbox.io benchmarks for now.

After creating your GitHub repository, you need to configure the CI/CD workflow if you want to benefit from automatic lint checks, tests and Docker builds.

  1. Create two GitHub secrets
  2. In the CI/CD workflow, update the environment variable docker_repository with the value docker.synapse.org/<synapse_project_id>/<docker_image> where:
    • <synapse_project_id>: the Synapse ID of a project you have created on Synapse.org.
    • <docker_image> is the name of your image/tool.
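A sketch of what step 2 might look like in the workflow file. The file name and surrounding keys are assumptions about a typical GitHub Actions layout, and the Synapse project ID shown is a hypothetical placeholder; only the docker_repository value follows the pattern described above.

```yaml
# .github/workflows/ci.yml (file name and structure are assumptions)
env:
  # docker.synapse.org/<synapse_project_id>/<docker_image>
  # syn12345678 is a placeholder; use your own Synapse project ID.
  docker_repository: docker.synapse.org/syn12345678/date-annotator-example
```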

Enabling version updates

This repository includes a Dependabot configuration that instructs GitHub to let you know when an update is available for one of your dependencies (e.g. Python, Node, Docker). Dependabot will automatically open a PR when an update is available. If you have configured the CI/CD workflow that comes with this repository, the workflow will automatically run and notify you if the update is breaking your code. You can then resolve the issue before merging the PR, hence making the update effective.
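For reference, a minimal Dependabot configuration along the lines described above might look as follows. This is a sketch, not the configuration shipped with this repository, which may track different ecosystems or use a different schedule.

```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "pip"     # Python dependencies in server/requirements.txt
    directory: "/server"
    schedule:
      interval: "daily"
  - package-ecosystem: "docker"  # base images in the Dockerfile
    directory: "/"
    schedule:
      interval: "daily"
```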

For more information on Dependabot, please visit the GitHub page Enabling and disabling version updates.

Generating a new NLP Sandbox tool using openapi-generator

The development of new NLP Sandbox tools is streamlined by using the openapi-generator to generate tool "stubs" for more than 50 programming languages and frameworks. Here a date annotator stub refers to an initial implementation that has been automatically generated by openapi-generator from the NLP Sandbox Date Annotator API specification.

Run the command below to get the list of languages and frameworks supported by openapi-generator (under the section SERVER generators).

npx @openapitools/openapi-generator-cli list

Generate the date annotator stub from an empty GitHub repository (here in Python-Flask):

mkdir server
npx @openapitools/openapi-generator-cli generate \
  -g python-flask \
  -o server \
  -i https://nlpsandbox.github.io/nlpsandbox-schemas/date-annotator/latest/openapi.json

where the option -i refers to the OpenAPI specification of the NLP Sandbox Date Annotator API.

The URL is composed of different elements:

  • date-annotator - The type of NLP Sandbox tool to generate. The list of all the NLP Sandbox tool types available is defined in the NLP Sandbox schemas.
  • latest - The latest stable version of the NLP Sandbox schemas. This token can be replaced by a specific release version x.y.z of the NLP Sandbox schemas.

Keeping your tool up-to-date

The NLP Sandbox schemas are updated as contributions are received from the community. For example, the Patient schema may in the future include additional information that NLP Sandbox tools can leverage to generate more accurate predictions.

After an update of the NLP Sandbox schemas, NLPSandbox.io will only evaluate tools that implement the latest version of the schemas. It is therefore important to keep your tools up-to-date and re-submit them so that they continue to appear in the leaderboards and to be used by the community.

This GitHub repository includes a workflow that checks daily whether a new release of the NLP Sandbox schemas is available; if one is, the workflow opens a PR. Follow the steps listed below to update your tool.

  1. Checkout the branch created by the workflow.

    git checkout <branch_name>
    
  2. Re-run the same openapi-generator command you used to generate the tool stub. If you started from an existing tool implementation like the one included in this GitHub repository, run the following command to update your tool to the latest version of the NLP Sandbox schemas (for this repository, the command is defined in package.json).

    npm run generate:server:latest
    
  3. Review the updates made to this tool in the NLP Sandbox schemas CHANGELOG.

  4. Review and merge the changes. If you are using VS Code, this step can be performed relatively easily using the section named "Source Control". This section lists the files that have been modified by the generator. When clicking on a file, VS Code shows side-by-side the current and updated version of the file. Changes can be accepted or rejected at the level of an entire file or for a selection of lines.

  5. Submit your updated tool to NLPSandbox.io.

Testing

If you started from an existing tool implementation like the one included in this GitHub repository, run the following command to lint and test your tool.

npm run lint
npm run test

For Python-Flask tools:

Preventing an NLP Sandbox tool from connecting to remote servers

The NLP Sandbox promotes the development of tools that are re-usable, reproducible, portable and cloud-ready. The table below describes how preventing a tool from connecting to remote servers contributes to some of these properties.

| Property | Description |
| --- | --- |
| Reproducibility | The output of a tool may not be reproducible if the tool depends on external resources that may, for example, no longer be available in the future. |
| Security | A tool may attempt to upload sensitive information to a remote server. |

The Docker Compose configuration included with this GitHub repository (docker-compose.yml) prevents the tool container from establishing remote connections. This is achieved through the use of an internal Docker network and an Nginx container placed in front of the tool container. One benefit is that you can test your tool locally and verify that it works even though it does not have access to the internet. Note that when a tool is evaluated on NLPSandbox.io, additional measures are put in place to prevent it from connecting to remote servers.
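The isolation pattern described above can be sketched in Compose as follows. Service names, images, and ports here are assumptions for illustration, not the repository's actual docker-compose.yml; the key idea is the internal network that the tool container is confined to.

```yaml
# Sketch of the isolation pattern (not the repository's actual file).
services:
  nginx:
    image: nginx:1.21
    ports:
      - "80:80"
    networks:
      - frontend   # reachable from the host
      - backend    # can forward requests to the tool
  tool:
    build: .
    networks:
      - backend    # internal only: the tool cannot reach the internet

networks:
  frontend:
  backend:
    internal: true  # Docker blocks outbound traffic from this network
```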

Versioning

GitHub release tags

This repository uses semantic versioning to track the releases of this tool. This repository uses "non-moving" GitHub tags, that is, a tag will always point to the same git commit once it has been created.

Docker image tags

The artifact published by the CI/CD workflow of this GitHub repository is a Docker image pushed to the Synapse Docker Registry. This table lists the image tags pushed to the registry.

| Tag name | Moving | Description |
| --- | --- | --- |
| latest | Yes | Latest stable release. |
| edge | Yes | Latest commit made to the default branch. |
| edge-&lt;sha&gt; | No | Same as edge, with a reference to the git commit. |
| &lt;major&gt;.&lt;minor&gt;.&lt;patch&gt; | No | Stable release. |

You should avoid using a moving tag like latest when deploying containers in production, because this makes it hard to track which version of the image is running and hard to roll back.

Benchmarking on NLPSandbox.io

Visit nlpsandbox.io for instructions on how to submit your NLP Sandbox tool and evaluate its performance.

Contributing

Thinking about contributing to this project? Get started by reading our contribution guide.

License

Apache License 2.0
