This is a vocabulary learning tool that analyses the vocabulary structure of a text and creates learning activities on the fly. This version is an adaptation; for the original, please refer to https://github.com/HaemanthSP/Vocabby. The adaptation works on German texts and lets learners analyse domain-specific texts. Additionally, a domain can be specified, which changes the word vectors used in the process.
To add domain vocabulary, navigate to Vocabby --> backend --> models and edit domain_to_filenames.txt to map each keyword to the filename it should refer to.
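The exact line format of domain_to_filenames.txt is not documented here, so the sketch below assumes one keyword/filename pair per line, separated by whitespace; check the file shipped in Vocabby/backend/models for the actual convention. A minimal loader under that assumption:

```python
def parse_domain_mapping(text: str) -> dict:
    """Parse assumed 'keyword filename' pairs, one per line.

    NOTE: this format is an illustration only -- it is not confirmed by the
    repository. Lines starting with '#' are treated as comments here.
    """
    mapping = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        keyword, filename = line.split(None, 1)  # split on first whitespace run
        mapping[keyword] = filename.strip()
    return mapping

if __name__ == "__main__":
    sample = "medicine med_vectors.model\n# a comment\nlaw law_vecs.model"
    print(parse_domain_mapping(sample))
```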
To create adapted models, refer to the TrainModels directory.
git clone https://github.com/HaemanthSP/Vocabby.git
cd Vocabby
git checkout 583af3c58ed526c762a103246d491e86f7bd8dfa
cd backend
conda create -n vocabby python=3.7.13
conda activate vocabby
Install the dependencies:
pip install -r requirements.txt
python -m spacy download de_core_news_lg
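Downloaded spaCy models are installed as regular Python packages, so a quick importability check confirms the download succeeded. This is a generic sketch; the package name comes from the download command above:

```python
import importlib.util

def model_installed(package_name: str) -> bool:
    """Return True if the given package (e.g. a spaCy model) is importable."""
    return importlib.util.find_spec(package_name) is not None

if __name__ == "__main__":
    # Should print True once 'python -m spacy download de_core_news_lg' ran
    print("de_core_news_lg installed:", model_installed("de_core_news_lg"))
```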
Start the backend server (still inside the backend directory):
python manage.py runserver 0.0.0.0:8080
To install Node.js on macOS (downloads and installs the latest .pkg):
curl "https://nodejs.org/dist/latest/node-${VERSION:-$(wget -qO- https://nodejs.org/dist/latest/ | sed -nE 's|.*>node-(.*)\.pkg</a>.*|\1|p')}.pkg" > "$HOME/Downloads/node-latest.pkg" && sudo installer -store -pkg "$HOME/Downloads/node-latest.pkg" -target "/"
To check the installation:
node -v
npm -v
Then update npm to the latest version:
npm install npm@latest -g
After installing Node.js and npm, open a new terminal at the root of the repo:
cd frontend
npm install
npm start
Access the application at http://localhost:3000.