Mar 4, 2024
Yes, use local LLMs. There's no need for TensorFlow (that's for training something custom); you'd use the Hugging Face transformers library for inference. Since you're dealing with sensitive educational data, it's best not to send it to third-party servers via APIs. Even if you're experienced in security, your institution probably won't allow it.
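To give you an idea, here's a minimal sketch of what local inference with transformers could look like. The model name, prompt wording, and helper function names are all illustrative assumptions, not recommendations — swap in whatever model you end up validating:

```python
# from transformers import pipeline  # pip install transformers torch

def feedback_prompt(student_name: str, answers: str) -> str:
    """Build a grading prompt for one student's exam answers (wording is illustrative)."""
    return (
        "You are a teaching assistant. Give concise feedback on each answer.\n"
        f"Student: {student_name}\n"
        f"Answers:\n{answers}\n"
        "Feedback:"
    )

def run_local(prompt: str, generator) -> str:
    """Run a transformers text-generation pipeline and return only the newly generated text."""
    out = generator(prompt, max_new_tokens=256, return_full_text=False)
    return out[0]["generated_text"].strip()

# Example wiring (downloads the model on first run; model name is just an example):
# gen = pipeline("text-generation",
#                model="mistralai/Mistral-7B-Instruct-v0.2",
#                device_map="auto")
# print(run_local(feedback_prompt("Alice", "Q1: ..."), gen))
```

Everything stays on your machine, which is the whole point for sensitive data.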
You can use LM Studio to serve a model locally and test different versions to find one that suits your needs. I also just discovered a tool in early development, AutoNL, that can help you parse the exam documents and output a separate file per student. See this thread for a benchmark of a local model against GPT-4: https://www.reddit.com/r/LocalLLaMA/comments/1b3xfbc/small_benchmark_gpt4_vs_opencodeinterpreter_67b/