Below is the list of available scripts with a brief description. The script options can be found in packages/courDeCassation/src/scripts.
Send documents to the NLP API and retrieve their annotations.
Import all documents to be pseudonymized from SDER.
Cleaning script (removes duplicated documents, among other things).
Clear Label database.
Delete all document-related data but keep users.
Delete all problem reports.
Delete a specific document from the Label database.
Count linked documents in the Label database (chained documents that are present in Label at the same time).
Display whether documents are assigned to multiple users (which is a bug).
Dump document data to the console.
Export all treated documents immediately (without the waiting delay).
Export a specific document.
Export treated documents (with the 4-day delay).
Export important "publishable" documents.
Calculate loss of the documents with the NLP API.
Free documents assigned to an annotator who has been inactive (AFK) for more than X minutes.
Manual import of a specific document.
Initialize the database with test values (local use only).
Create a new user.
List caches.
List documents.
List documents with problem reports.
Purge the database (for now, only the users referenced in statistics older than 6 months).
If the NLP API was outdated or buggy, reannotate free documents. Warning: suspend the nlp-annotation job during this operation to avoid side effects. This script only prepares the documents by setting their status to "loaded"; the next nlp-annotation job will reannotate them.
Renew the cache.
Randomise untreated documents; use only in local or dev environments.
Reapply route rules for documents with a specific status.
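The "treated documents" export above applies a 4-day delay before a document becomes exportable. A minimal sketch of such a filter, assuming a hypothetical `isExportable` helper (the function name, parameters, and default are illustrative, not the actual Label implementation):

```typescript
// Hypothetical sketch: decides whether a treated document has waited
// long enough to be exported. `delayDays` defaults to the 4-day delay
// mentioned in the script description above.
function isExportable(treatedAt: Date, now: Date, delayDays = 4): boolean {
  const delayMs = delayDays * 24 * 60 * 60 * 1000;
  // Exportable once at least `delayDays` have elapsed since treatment.
  return now.getTime() - treatedAt.getTime() >= delayMs;
}
```

The "without waiting" export variant would simply skip this check.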
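The AFK-freeing script above releases documents held by inactive annotators. A minimal sketch of the selection logic, under assumed names and shapes (`Assignment`, `lastActivityAt`, `findStaleAssignments` are all hypothetical, not the actual Label schema):

```typescript
// Hypothetical shape of an annotator's document assignment.
interface Assignment {
  documentId: string;
  userId: string;
  lastActivityAt: Date;
}

// Returns the assignments whose annotator has been inactive for more
// than `timeoutMinutes`; the corresponding documents should be freed.
function findStaleAssignments(
  assignments: Assignment[],
  timeoutMinutes: number,
  now: Date,
): Assignment[] {
  const cutoff = now.getTime() - timeoutMinutes * 60_000;
  return assignments.filter((a) => a.lastActivityAt.getTime() < cutoff);
}
```

The real script would then unassign those documents and put them back in the pool for other annotators.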
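The reannotation script above does not call the NLP API itself: it only resets documents so the next nlp-annotation job picks them up. A minimal sketch of that preparation step, with assumed status values and field names (this is an illustration, not the actual Label data model):

```typescript
// Hypothetical document statuses; only "free" and "loaded" matter here.
type DocumentStatus = 'free' | 'pending' | 'loaded' | 'done';

interface LabelDocument {
  _id: string;
  status: DocumentStatus;
}

// Marks every "free" document as "loaded" so the next nlp-annotation
// job reannotates it; documents in any other status are left untouched.
function prepareForReannotation(documents: LabelDocument[]): LabelDocument[] {
  return documents.map((doc) =>
    doc.status === 'free' ? { ...doc, status: 'loaded' } : doc,
  );
}
```

Touching only "free" documents is what makes suspending the nlp-annotation job important: it avoids the job racing with the reset.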