This script uses Python to load PDFs, then calls the Anthropic LLM to generate a course syllabus. The syllabus is fed back to the LLM along with the PDF data to generate lesson plans, quizzes, and other materials as JSON, with each stage building on the output of the last. What is great is that it automatically localizes the content for the country where I will be teaching, Timor-Leste.
https://github.com/ddtraveller/TEFLTools/blob/main/generate_course.py
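The staged flow described above might be sketched like this. This is a minimal illustration, not the script's actual code: the function names, prompt wording, and the `call_llm` hook are all my assumptions.

```python
# Hypothetical sketch of a staged course-generation pipeline (names assumed).
import json


def build_syllabus_prompt(pdf_text: str, country: str) -> str:
    """Stage 1: ask the model for a syllabus localized to `country`."""
    return (
        f"Using the source material below, write a TEFL course syllabus "
        f"adapted for teaching in {country}.\n\n{pdf_text}"
    )


def build_lessons_prompt(syllabus: str, pdf_text: str) -> str:
    """Stage 2: feed the syllabus back with the source text and ask for JSON."""
    return (
        "Given this syllabus:\n" + syllabus +
        "\n\nand this source material:\n" + pdf_text +
        "\n\nReturn lesson plans and quizzes as a JSON array."
    )


def run_pipeline(pdf_text, call_llm, country="Timor-Leste"):
    """Chain the stages, each one consuming the previous stage's output."""
    syllabus = call_llm(build_syllabus_prompt(pdf_text, country))
    lessons = json.loads(call_llm(build_lessons_prompt(syllabus, pdf_text)))
    return syllabus, lessons
```

In the real script, `call_llm` would wrap an Anthropic API client call, and the PDF text would come from a PDF-parsing library rather than being passed in directly.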
This script uses the Google translation service, speech recognition, and GPT4All to make a chatbot that can speak and comprehend pretty well:
https://github.com/ddtraveller/TEFLbot/blob/main/src/main.py
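The core of such a voice assistant is a listen, generate, speak loop. Here is a minimal sketch of that loop; the function names are my assumptions, and the real script would wire in SpeechRecognition, the Google translate/TTS services, and GPT4All where the injected callables sit.

```python
# Hypothetical sketch of a voice-assistant loop (names assumed; the real
# dependencies are injected as callables so the structure stays visible).

def chat_turn(heard_text, generate, translate=None):
    """One conversational turn: optionally translate the input, then reply."""
    text = translate(heard_text) if translate else heard_text
    return generate(text)


def run_loop(listen, generate, speak, stop_word="goodbye"):
    """Listen -> generate -> speak until the user says the stop word."""
    while True:
        heard = listen()
        if heard.lower().strip() == stop_word:
            break
        speak(chat_turn(heard, generate))
```

Injecting `listen`, `generate`, and `speak` as parameters also makes the loop easy to test without a microphone or a model loaded.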
This version loads data from files related to the question and includes it in the prompt to ground the answer. For example, you could load an API reference document and have the LLM read it all for you and answer your question:
https://github.com/ddtraveller/TEFLbot/blob/main/src/main_RAG.py
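The idea of picking relevant files and stuffing them into the prompt can be sketched with a crude keyword-overlap score. This is an illustration only; the actual script's retrieval and prompt format may differ.

```python
# Hypothetical sketch of file-grounded prompting (a very simple retrieval step).
import re


def score(question: str, text: str) -> int:
    """Crude relevance: count question words that also appear in the document."""
    q_words = set(re.findall(r"\w+", question.lower()))
    d_words = set(re.findall(r"\w+", text.lower()))
    return len(q_words & d_words)


def build_prompt(question: str, docs: list[str], top_k: int = 2) -> str:
    """Pick the most relevant docs and prepend them as context for the LLM."""
    ranked = sorted(docs, key=lambda d: score(question, d), reverse=True)
    context = "\n---\n".join(ranked[:top_k])
    return f"Answer using this reference material:\n{context}\n\nQuestion: {question}"
```

A production version would typically use embeddings and chunking instead of word overlap, but the shape is the same: retrieve, then prompt.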
It would be helpful to see a README on your Git repo.
https://yattishr.medium.com/unleash-the-power-of-local-open-source-llm-hosting-e33bf6a9679f
Interesting model manager.
The WebUI (Web User Interface) tool described in the guide allows users to interact with and manage open-source language models locally. Here are the key functionalities it provides:
Model Management:
Download various open-source language models directly through the interface
Load and switch between different models
Manage model files and configurations
Text Generation:
Interact with the loaded language model in a chat-like interface
Generate text based on prompts or questions you provide
Parameter Tuning:
Adjust various settings and parameters that affect the model's performance and output
Toggle options like CPU usage, batch sizes, and sampling methods
Fine-tuning:
Access tools for fine-tuning models on custom datasets
Customize the model for specific use cases or domains
Performance Monitoring:
View resource usage and model performance metrics
Multiple Interface Modes:
Chat mode for conversational interactions
Notebook mode for more structured input/output
Instruct mode for giving specific instructions to the model
API Access:
Potentially expose the model as an API for integration with other applications
Model Comparison:
Load multiple models and compare their outputs side-by-side
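Regarding the API Access point above: text-generation-webui, a common tool of this kind, can expose an OpenAI-compatible endpoint when launched with its API flag. Here is a minimal client sketch; the host, port, and route below assume a default local install and may need adjusting.

```python
# Hedged example: build a request for a locally hosted OpenAI-style
# /v1/completions route (default host/port for text-generation-webui assumed).
import json
import urllib.request


def completion_request(prompt: str, host: str = "http://127.0.0.1:5000"):
    """Build a POST request asking the local WebUI server for a completion."""
    payload = json.dumps({
        "prompt": prompt,
        "max_tokens": 200,
        "temperature": 0.7,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{host}/v1/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


# Sending the request requires the WebUI server to be running:
# resp = urllib.request.urlopen(completion_request("Hello"))
```

Because the endpoint mimics the OpenAI API shape, existing OpenAI client libraries can often be pointed at the local server by changing only the base URL.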